Presentation Transcript
slide1

Computational Algebraic Problems in

Variational PDE Image Processing

Tony F. Chan

Department of Mathematics, UCLA

International Summer School in Numerical Linear Algebra

Chinese University of Hong Kong

Reports: www.math.ucla.edu/~imagers

Supported by ONR, NIH and NSF

slide2

Collaborators

Peter Blomgren (Stanford)

Raymond Chan (CUHK)

Gene Golub (Stanford)

Pep Mulet (Valencia)

Jackie Shen (Minnesota)

Luminita Vese (UCLA)

Justin Wan (Waterloo)

C.K. Wong

Jamylle Carter (MSRI)

Berta Sandberg (TechFinity)

Ke Chen (Liverpool)

Xue-Cheng Tai (Bergen, Norway)

Jean-Francois Aujol (Cachan)

Selim Esedoglu (Michigan)

Fred Park (UCLA -> Michigan)

slide3

Outline of 4 Lectures

  • 1. Introduction to PDE Image Models & Algorithms:
  • Denoising, deblurring, active contours, segmentation
  • 2. Algebraic Problems from Denoising & Deblurring
  • Linear preconditioning techniques
  • Nonlinear + Optimization (duality) techniques
  • Multigrid techniques
  • 3. Algebraic Problems from Active Contours/Segmentation
    • Curve evolution techniques
    • Direct optimization techniques
Goal of Lectures
  • Broad overview rather than latest techniques
  • Details on only a few topics --- see papers + reports + web
  • No comprehensive referencing
  • Limit to PDE aspects (there are non-PDE approaches; see forthcoming SIAM book by Hansen, Nagy, O’Leary)
  • Please ask questions (English, Cantonese, Mandarin)
slide5

TYPICAL IMAGE PROCESSING TASKS

* Denoising/Inpainting

* Deblurring

* Enhancement

* Compression

* Segmentation

* Object Detection/Identification

* Object/Pattern Recognition

* Gray-Scale vs Vector-Valued

* Still vs Video

* Registration

Related fields: Computer Graphics, Computer Vision.

IP And Applied Math
  • Important applications:
    • Medical, astronomy, Comp. Vision/Comp. Graphics,
  • Math Models:
    • standard or create your own
  • Math Tools:
    • harmonic analysis, PDEs, Bayesian statistics, differential geometry, CFD, multiscale, optimization,…
  • Analysis of Models:
    • existence, uniqueness, properties
  • Challenging Computations
    • 3D+time+multi-components, nonlinearity, non-smooth
slide7

Examples of PDE Image Models

Denoising and Inpainting

slide8

The Restoration Problem

A given observed image z is related to the true image u through blur K and noise n:

z = Ku + n

[Images: initial image; blurred; blurred + noise.]

Inverse Problem: restore u, given K and statistics for n.

Ill-posed: needs proper regularization.

Keeping edges sharp and in the correct location is a key problem !

Total Variation Regularization
  • Measures “variation” of u, w/o penalizing discontinuities.
  • |.| similar to Huber function in robust statistics.
  • 1D: If u is monotonic in [a,b], then TV(u) = |u(b) – u(a)|, regardless of whether u is discontinuous or not.
  • nD: If u(D) = char fcn of D, then TV(u) = “surface area” of D.
  • (Coarea formula)
  • Thus TV controls both size of jumps and geometry of boundaries.
  • Extensions to vector-valued functions
    • Color TV: Blomgren-C 98; Ringach-Sapiro, Kimmel-Sochen
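The 1D property above is easy to check numerically. A minimal sketch (the helper `tv_1d` is ours, not from the lectures):

```python
import numpy as np

def tv_1d(u):
    """Discrete total variation of a 1D signal: sum of |u[i+1] - u[i]|."""
    return np.abs(np.diff(u)).sum()

# A monotone ramp and a step edge with the same endpoints u(a)=0, u(b)=1:
ramp = np.linspace(0.0, 1.0, 101)               # continuous, monotone
step = np.where(np.arange(101) < 50, 0.0, 1.0)  # discontinuous, same endpoints

tv_ramp = tv_1d(ramp)   # = |u(b) - u(a)| = 1, since u is monotone
tv_step = tv_1d(step)   # also 1: the jump is not penalized extra
```

Both signals have TV equal to 1, illustrating that TV measures total variation without penalizing the discontinuity itself.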
slide10

Total Variation Restoration

Regularization: TV(u) = ∫ |∇u| dx

Variational Model: min_u TV(u) + (λ/2) ∫ (Ku − z)² dx

* First proposed by Rudin-Osher-Fatemi ’92.

* Allows for edge capturing (discontinuities along curves).

* TVD schemes popular for shock capturing.

Gradient flow: u_t = ∇·(∇u/|∇u|) − λ K*(Ku − z), with an anisotropic diffusion term and a data fidelity term.
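As an illustration, a sketch of explicit time marching for the gradient flow in the denoising case K = I; the β-smoothing of |∇u|, the periodic boundary shortcut, and all parameter values are our illustrative assumptions:

```python
import numpy as np

def tv_denoise_flow(z, lam=0.1, beta=0.1, dt=0.01, n_steps=300):
    """Explicit gradient flow for TV denoising (K = I):
       u_t = div( grad u / |grad u|_beta ) - lam * (u - z),
       with |grad u|_beta = sqrt(|grad u|^2 + beta^2)."""
    u = z.copy()
    for _ in range(n_steps):
        # forward differences (periodic via roll; a boundary-handling shortcut)
        ux = np.roll(u, -1, axis=1) - u
        uy = np.roll(u, -1, axis=0) - u
        mag = np.sqrt(ux**2 + uy**2 + beta**2)
        px, py = ux / mag, uy / mag
        # backward differences give the discrete divergence
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u = u + dt * (div - lam * (u - z))
    return u

rng = np.random.default_rng(0)
clean = np.zeros((32, 32)); clean[8:24, 8:24] = 1.0
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
denoised = tv_denoise_flow(noisy)
```

Note the time step must respect a CFL-type bound that shrinks as β → 0, which is exactly the slow-convergence issue discussed later.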

Inpaintings: Generalized Restoration Models

Disocclusion

Scratch Removal

Graffiti Removal

Examples of TV Inpaintings

Where is the Inpainting Region?

slide16

Examples of PDE Image Models

Blind (unknown blur) and Non-blind Deblurring

slide17

Deblurring

[Images: original image; out-of-focus blur; blurred image.]

- Blurring operator K usually ill-conditioned

- Need to solve systems with a differential + convolution operator:

slide19

TV Blind Deconvolution

C. + Wong (98)

* Variational Model:

* Alternating minimization algorithm:

  • Algorithm gives a 1-parameter family of solutions, determined by the SNR.
slide20

TV Blind deconvolution

[Images: recovered image and recovered blurring function, for parameter values 0, 1e-7, 1e-5, 1e-4.]

slide21

Original image

Out-of-focus blurred: blind vs. non-blind

Gaussian blurred: blind vs. non-blind

slide22

Examples of PDE Image Models

Active Contours & Segmentation

slide23

Active Contour w/o Edges (C.-Vese 99)

Features:

Automatically detects interior contours!

Works very well for concave objects

Robust w.r.t. noise

Detects blurred contours

The initial curve can be placed anywhere!

Allows for automatic changes of topology

Objects Found

slide24

MRI brain image

Europe nightlights

The model detects contours without gradient (cognitive contours)

Motion Segmentation (Moelich, Chan 2004)

Motion determined by logical AND & OR on frame differences over 3 consecutive frames.

Designed for low frame rate videos.

Another extension of Chan-Vese 2001 model.

Implemented via level sets.

Westwood Blvd

Olympic Blvd, LA

(2 frames per sec)

UCLA

(1 frame per sec)

slide26

Other Related PDE Image Models

* Geometry driven diffusion (see book by Bart M. ter Haar Romeny 1994)

* Anisotropic diffusion (Perona-Malik 87)

* Fundamental IP PDE (Alvarez-Guichard-Lions-Morel 92)

* Affine invariant flow (Sapiro-Tannenbaum 93)

* TV + Textures (Meyer 2001, Osher-Vese 02, Osher-Sole-Vese 02, Osher-Sapiro-Vese 02, Chambolle 03)

slide27

Different Frameworks for Image Processing

Statistical/Stochastic Models:

Maximum Likelihood Estimation with uncertain data

Transform-Based Models:

Fourier/Wavelets --- process features of images (e.g. noise)

in transform space (e.g. thresholding)

Variational PDE Models:

Evolve image according to local derivative/geometric info,

e.g. denoising ⇒ diffusion

Concepts are related mathematically:

Brownian Motion --- Fourier Analysis --- Diffusion Equation

slide28

Features & Advantages of PDE Imaging Models

* Use PDE concepts: gradients, diffusion, curvature, level sets

* Exploit sophisticated PDE and CFD (e.g. shock capturing) techniques

Restoration:

- sharper edges, fewer edge artifacts, often morphological

Segmentation:

- scale adaptivity, geometry-based, controlled regularity of boundaries, segments can have complex topologies

Newer, less well developed/accepted.

Combining PDE with other techniques.

slide29

Computational Challenges

* Size: large # of pixels, color and multi-channel, 3D, videos.

* TV(u) non-differentiable where ∇u = 0; need numerical regularization: |∇u|_β = √(|∇u|² + β²).

* High nonlinearity of the resulting Euler-Lagrange operator ∇·(∇u/|∇u|_β).

* Ill-conditioning: the diffusion coefficient 1/|∇u|_β blows up as β → 0.

* Highly varying coefficients: 1/|∇u|_β is small at edges, large in flat regions.

* Need to precondition differential + convolution operators.

slide30

Some Books/Surveys for PDE Imaging

  • Morel-Solimini 94: Variational Methods in Image Segmentation
  • Romeny 94: Geometry Driven Diffusion in Computer Vision
  • Alvarez-Morel 94: Acta Numerica (review article)
  • IEEE Tran. Image Proc. 3/98, Special Issue
  • J. Weickert 98: Anisotropic Diffusion in Image Processing
  • G. Sapiro 2000: Geometric PDE & Imaging
  • Aubert-Kornprobst 2002: Math Aspects of Image Processing
  • Osher-Fedkiw 2003: “Level Set Bible”
  • Chan, Shen & Vese Jan 03, Notices of AMS (review)
  • Paragios, Chen, Faugeras 2005: Collection of articles
  • Chan-Shen 2005: Image Processing & Analysis
slide32

Outline of 4 Lectures

  • 1. Introduction to PDE Image Models & Algorithms:
  • Denoising, deblurring, active contours, segmentation
  • 2. Algebraic Problems from Denoising & Deblurring
  • Linear preconditioning techniques
  • Nonlinear + Optimization (duality) techniques
  • Multigrid techniques
  • 3. Algebraic Problems from Active Contours/Segmentation
    • Curve evolution techniques
    • Direct optimization techniques
slide33

Restoration Problem in Discrete Algebraic Form

min_u Σ_i |(∇_h u)_i| + (λ/2) ‖Ku − z‖², where ∇_h and div_h = −∇_h^T denote the discrete gradient and divergence.

Gradient Flow: u_t = div_h(∇_h u / |∇_h u|) − λ K^T(Ku − z)
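In the discrete formulation it matters that the discrete divergence is the negative adjoint of the discrete gradient. A quick numerical check of one standard choice (the helper names and boundary handling are ours):

```python
import numpy as np

def grad(u):
    """Forward-difference gradient; last row/column difference set to 0."""
    ux = np.diff(u, axis=1, append=u[:, -1:])
    uy = np.diff(u, axis=0, append=u[-1:, :])
    return ux, uy

def div(px, py):
    """Backward-difference divergence, built so that div = -grad^T."""
    dx = np.empty_like(px)
    dx[:, 0] = px[:, 0]
    dx[:, 1:-1] = px[:, 1:-1] - px[:, :-2]
    dx[:, -1] = -px[:, -2]
    dy = np.empty_like(py)
    dy[0, :] = py[0, :]
    dy[1:-1, :] = py[1:-1, :] - py[:-2, :]
    dy[-1, :] = -py[-2, :]
    return dx + dy

rng = np.random.default_rng(1)
u = rng.standard_normal((8, 8))
px, py = rng.standard_normal((2, 8, 8))
ux, uy = grad(u)
# <grad u, p> should equal -<u, div p> up to rounding:
lhs = (ux * px + uy * py).sum()
rhs = -(u * div(px, py)).sum()
```

Getting this adjoint pair right is what makes dual and primal-dual iterations (later lectures) consistent with the primal energy.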

slide35

Morphological Diagonal Scaling (Marquina, Osher 2000)

  • Two different but related motivations:
  • morphological evolution of level sets --- moves in direction of normal with speed proportional to curvature, independent of contrast.
  • diagonally scaled Richardson stationary iteration.
  • Advantages:
  • cost per time step similar to time marching
  • much faster convergence to steady state
slide38

Unconstrained Modular Solver for Discrepancy Principle

Blomgren-C. 96

Discrepancy Principle Constrained Problem:

min_u R(u)   subject to   ‖Ku − z‖ = σ   (σ = noise level)

Tikhonov Unconstrained Problem:

min_u R(u) + (λ/2) ‖Ku − z‖²

Modular Solver S(λ): an efficient solver for fixed λ (e.g. Time Marching, Fixed Point, Primal-Dual).

- Can make use of S to solve the constrained problem efficiently.

- Based on block elimination + Newton’s method for the constrained problem, via calls to S for computing directional derivatives.

A modular Newton’s Method

System of nonlinear equations: G(u, λ) = 0; N(u, λ) = 0.

Newton’s Method:

[ G_u  G_λ ] [ δu ]      [ G ]
[ N_u  N_λ ] [ δλ ]  = − [ N ]

Block elimination:

Define w = −G_u⁻¹ G;  v = G_u⁻¹ G_λ

δλ = −(N_λ − N_u v)⁻¹ (N_u w + N);  δu = w − v δλ

Main idea: Replace computation of w & v by calls to S(u, λ)
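A toy sketch of the block-elimination idea on a small invented problem (A, b, c, and the constraint are illustrative, not from the lectures): G(u, λ) = Au + λb − c plays the role of the inner equation whose Jacobian solves are delegated to the modular solver S, and N(u, λ) = uᵀu − 1 plays the role of the constraint:

```python
import numpy as np

# Hypothetical toy data for illustration only:
A = np.diag([2.0, 3.0])
b = np.array([1.0, 1.0])
c = np.array([3.0, 3.0])

def G(u, lam): return A @ u + lam * b - c
def N(u, lam): return u @ u - 1.0

def S(rhs):
    """Stand-in for the modular solver: applies G_u^{-1} (here G_u = A)."""
    return np.linalg.solve(A, rhs)

u, lam = np.array([1.0, 0.0]), 0.0
for _ in range(20):
    w = -S(G(u, lam))            # w = -G_u^{-1} G
    v = S(b)                     # v =  G_u^{-1} G_lam   (G_lam = b here)
    Nu, Nlam = 2.0 * u, 0.0      # partial derivatives of the constraint
    dlam = -(Nu @ w + N(u, lam)) / (Nlam - Nu @ v)
    du = w - v * dlam            # back-substitution for the u-update
    u, lam = u + du, lam + dlam
```

The loop is algebraically identical to Newton on the full (u, λ) system, but the only linear algebra it performs is calls to S.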

slide43

Outline of 4 Lectures

  • 1. Introduction to PDE Image Models & Algorithms:
  • Denoising, deblurring, active contours, segmentation
  • 2. Algebraic Problems from Denoising & Deblurring
  • Linear preconditioning techniques
  • Nonlinear + Optimization (duality) techniques
  • Multigrid techniques
  • 3. Algebraic Problems from Active Contours/Segmentation
    • Curve evolution techniques
    • Direct optimization techniques
Difficulties with Primal TV
  • TV norm non-differentiable ⇒ regularize functional: |∇u|_β = √(|∇u|² + β²)
  • Primal: gradient flow also needs regularization
  • Problem becomes difficult for small β
  • But edges smeared for large β
  • Artificial time marching at best linearly convergent
  • Nonlinear relaxation (e.g. GS) non-convergent w/o β
  • Ill-conditioning due to spatial scales (CFL; MG)
Towards Quadratically Convergent Methods

Newton’s Method [R. Chan, T. Chan, Zhou, ‘95]

  • As ! 0 ) Size of Domain of Convergence ! 0
  • Tied Continuation on .
  • But Efficient Continuation Not Easy to Obtain!
slide46

Primal-Dual Method:(Chan, Golub, and Mulet ‘95)

Instead of u system:

Introduce Auxiliary (Dual) Variable w := ∇u / |∇u|; note |w| = 1.

Linearize the (w,u)-system:

  • Similar to primal-dual (Conn & Overton ’95, Anderson ’95)
slide47

Why Linearization Works Well

  • w = ∇u/|∇u| is normal to the level sets of u
  • ∇·w = curvature of the level sets
  • Time marching can be regarded as curvature-driven flow
  • Better global convergence of Newton’s method for the (w,u)-system: it is more “globally linear” than the u-system
slide48

Linearized Systems

  • Linearization of u-system
  • Linearization of (w,u)-system
  • Similar Structure and Cost for Both Systems
CGM Method: Convergence Results

Residual vs. Iterations

||un-utrue|| vs. Iterations

Relatively robust wrt β; but iteration # increases as β → 0.

Introducing Primal-Dual (J. Carter ’01)

TV Model:

Rewrite by introducing a dual variable p:

Swap inf & sup (G strictly convex in u and concave (linear) in p):

For each p, solution to inf_u G(u, p):

Back-substitute u, and use:

Primal-Dual ⇒ Dual

Problem Reduces to the Dual Problem:

Objective quadratic; but many constraints (|p_i| ≤ 1 at each pixel)

Optimality Conditions (Discrete Case):

Complementarity:

Update for u:

slide57

Potential Difficulties To Dual Problem

  • Many Constraints
  • Use Standard Constrained Optimization Meths
    • [Carter, Vandenberghe ‘02] Barrier Methods
    • Interior Point Methods?
    • Penalty Methods?
  • But need to estimate algorithmic parameters
  • Other related ideas:
    • Second order cone programming [Yin, Goldfarb, Osher]
    • Graph Cut [Zabih, Boykov, Kolmogorov, ...]
Chambolle’s Key Observation (‘04)

Optimality Conditions from Dual TV Formulation:

Complementarity:

Key Observation:

Lagrange Multipliers Eliminated!

Elimination of Lagrange Multipliers

Thus, Lagrangian Simplifies To:

Solve via Semi-Implicit Gradient Descent (Chambolle):

Reduces to Explicit Scheme:

Convergence of Scheme

Theorem:

(Chambolle)

  • Iterates Decrease Energy
  • Empirically: # iter ~ # pixels (2-D)
  • Convergence: At Best Linear
  • No β Parameter Needed
  • Globally Convergent for any p0
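One common form of Chambolle's semi-implicit dual iteration for the denoising case, as a sketch (the fidelity weighting convention, step size τ ≤ 1/8, and the adjoint grad/div helpers are the standard presentation, but parameter values and names are our assumptions):

```python
import numpy as np

def grad(u):
    """Forward-difference gradient; last row/column difference set to 0."""
    ux = np.diff(u, axis=1, append=u[:, -1:])
    uy = np.diff(u, axis=0, append=u[-1:, :])
    return ux, uy

def div(px, py):
    """Backward-difference divergence, the negative adjoint of grad."""
    dx = np.empty_like(px)
    dx[:, 0] = px[:, 0]; dx[:, 1:-1] = px[:, 1:-1] - px[:, :-2]; dx[:, -1] = -px[:, -2]
    dy = np.empty_like(py)
    dy[0, :] = py[0, :]; dy[1:-1, :] = py[1:-1, :] - py[:-2, :]; dy[-1, :] = -py[-2, :]
    return dx + dy

def chambolle_denoise(z, lam=0.25, tau=0.248, n_iter=200):
    """Dual ROF iteration: p is renormalized pointwise so that |p| <= 1,
       which is exactly the elimination of the Lagrange multipliers.
       tau <= 1/8 gives convergence; the restored image is u = z - lam*div(p)."""
    px = np.zeros_like(z)
    py = np.zeros_like(z)
    for _ in range(n_iter):
        gx, gy = grad(div(px, py) - z / lam)
        norm = np.sqrt(gx**2 + gy**2)
        px = (px + tau * gx) / (1.0 + tau * norm)
        py = (py + tau * gy) / (1.0 + tau * norm)
    return z - lam * div(px, py)

rng = np.random.default_rng(0)
clean = np.zeros((32, 32)); clean[8:24, 8:24] = 1.0
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
u = chambolle_denoise(noisy)
```

Note there is no β anywhere in the scheme: the pointwise renormalization handles the constraint exactly.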

ROF Dual: Residual vs # Iterations

[Plot: residuals vs. iteration count on an observed image. Time marching (TM) needs ~1500 iterations; the Chambolle dual scheme (CD) needs ~250.]

# Iterations versus λ

[Plot: iteration counts over a range of values of λ.]

# Iterations versus Image Size

[Plot: iteration count versus image size.]

Non-Smooth Newton Methods
  • Recent works by M. Ng, Qi 2005
  • Chambolle’s equation is non-differentiable:
  • Non-smooth Newton: uses sub-gradients near singularities
  • Superlinear convergence is achievable in theory
slide65

Outline of 4 Lectures

  • 1. Introduction to PDE Image Models & Algorithms:
  • Denoising, deblurring, active contours, segmentation
  • 2. Algebraic Problems from Denoising & Deblurring
  • Linear preconditioning techniques
  • Nonlinear + Optimization (duality) techniques
  • Multigrid techniques
  • 3. Algebraic Problems from Active Contours/Segmentation
    • Curve evolution techniques
    • Direct optimization techniques
Time Marching vs Fixed Point
  • In image denoising, the TV regularization approach leads to:
  • In the fixed-point iteration, one fixes |u_x| = |u_x^n| and u − u_0 = u^n − u_0, and solves:
  • Apply 1 step of Richardson with relaxation parameter Δt & preconditioner B:
Time Marching vs Fixed Point (cont.)
  • B = I: time marching scheme of Rudin-Osher-Fatemi (92).
  • B = |u_x^n|: time marching scheme of Marquina-Osher (00), i.e. diagonally preconditioned Richardson.

  • In general, one can choose other B, e.g. multigrid, to speed up convergence.
  • If 1 pre- & 1 post- GS smoothing are used, 1 MG cycle ~ 4 time marching steps.
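A 1D sketch of the preconditioned Richardson step, contrasting B = I with the Marquina-Osher diagonal scaling B = |u_x^n|_β (all parameter values, helper names, and the boundary handling are our illustrative assumptions):

```python
import numpy as np

def richardson_tv_step(u, z, lam=1.0, beta=0.1, dt=0.2, marquina=True):
    """One preconditioned Richardson step for the lagged-diffusivity 1D TV
       system.  Residual: r = (u_x / |u_x|_beta)_x - lam*(u - z).
       Update: u <- u + dt * B r, with B = I (plain time marching; needs a
       much smaller dt, on the order of beta) or B = diag(|u_x|_beta)
       (Marquina-Osher morphological scaling, which allows larger dt)."""
    ux = np.diff(u, append=u[-1])          # forward difference
    mag = np.sqrt(ux**2 + beta**2)         # |u_x|_beta
    div = np.diff(ux / mag, prepend=0.0)   # backward difference of the flux
    r = div - lam * (u - z)
    B = mag if marquina else np.ones_like(u)
    return u + dt * B * r

rng = np.random.default_rng(2)
clean = np.where(np.arange(100) < 50, 0.0, 1.0)
z = clean + 0.1 * rng.standard_normal(100)
u = z.copy()
for _ in range(200):
    u = richardson_tv_step(u, z)           # dt = 0.2 is stable with scaling
```

The same structure accommodates any other B (e.g. one multigrid cycle) in place of the diagonal scaling.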
slide71

Comparison of Preconditioners

Inner iteration: 1 Richardson step with diag or MG preconditioner

slide73

Exact FP vs Inexact FP

Inexact FP: 1 MG V-cycle; Exact FP: ~10 MG V-cycles

A demonstration

[Video demonstration: X-ray of a hand.]

slide89

Multigrid for Differential+Convolution Problems

  • Linear MG, V-cycles, various smoothers (R. Chan, C., Wan 97)
  • > works well for I, Laplacian but not as well for TV.
  • > difficult to find good smoother for diff-conv problems
  • > spectral properties of diff and conv operators “flipped”.
  • Multilevel Additive Schwarz (Hanke, Vogel 98)
  • > 2 grid levels; projection of the diff-conv operator directly to a coarse grid.
  • > coarse problem dense, solved using a direct method.
Spectrum of −αD + K

[Plots: smallest, middle, and largest eigenvalues of −αD + K for α = 1, 10⁻⁴, 10⁻⁸.]

Richardson Smoothing

[Plots: error after 0, 1, 5, and 10 Richardson iterations for α = 1, 10⁻⁴, 10⁻⁸.]

slide92

Outline

  • 1. PDE Image Models:
  • Denoising, deblurring, active contours, segmentation
  • 2. Algebraic Problems from Denoising & Deblurring
  • Nonlinear + Optimization techniques
  • Linear preconditioning techniques
  • 3. Algebraic Problems from Active Contours/Segmentation
slide93

Active Contour w/o Edges (C.-Vese 99)

Features:

Automatically detects interior contours!

Works very well for concave objects

Robust w.r.t. noise

Detects blurred contours

The initial curve can be placed anywhere!

Allows for automatic changes of topology

Objects Found

slide94

An Active Contour model “without edges”

Fitting + Regularization terms (length, area)

Connection with Segmentation:

Active contour model partitions the image into 2 segments –

inside and outside.

slide95

Level Sets (Osher - Sethian ‘87)

[Figure: a curve C with its inside and outside regions; C is represented implicitly as the zero level set of a function φ.]

* Allows automatic topology changes, cusps, merging and breaking.

slide96

Main Evolutionary Equation in Active Contour Model

  • Possible Approaches:
  • Time marching using explicit or implicit schemes
  • Solve for Steady State directly
  • Use pointwise-relaxation for linear systems

- Linearize curvature term using FP:

Fast Algorithms for Level Set Segmentation
  • Implicit methods (CV ’01): allow larger time steps
  • Multigrid (Fedkiw, C, Kang, Vese ’01; Tsai, Willsky, Yezzi ‘00):
    • interpolate LSF from coarse grid as initial guess for fine grid
  • Direct Optimization (Song-C ’02):
    • sweep through pixels, decide pixel’s region membership by value of energy functional.
slide98

Two linear schemes (fixed point)

Evolutionary iterative scheme

Stationary iterative scheme

Typically, we use only 1 step of Gauss-Seidel relaxation for the linearized system.

slide99

Comparison of the evolutionary/stationary schemes

Evolutionary Scheme (CPU time = 59.13 sec)

0 It 50 It 500 It 1000 It 2000 It 4000 It

Stationary Scheme (CPU time = 0.63 sec)

0 It 10 It 20 It 30 It 40 It 50 It

slide100

Multigrid Ideas For Active Contours

  • Use MG for solving the linear systems arising in evolution
  • Use MG for solving nonlinear steady state equations
  • Use full MG to obtain better initial guess for curve:
  • > Down-sample image to lower resolution
  • > Solve active contour problem on low resolution image
  • > Interpolate level set function to fine resolution image.
  • φ smooth ⇒ good approximation from low resolution.
  • > Evolution on fine resolution image picks up details.
  • Refs: Tsai, Willsky, Yezzi 2000, C., Fedkiw, Kang, Vese 2000
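The coarse-to-fine initialization above can be sketched as a generic driver (the solver stand-in, helper names, and power-of-two-divisible image sizes are our assumptions; any single-grid active contour solver could replace `fit_step`):

```python
import numpy as np

def restrict(img):
    """Down-sample the image by 2x2 block averaging."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def interpolate(phi):
    """Piecewise-constant up-sampling of the level set function."""
    return np.kron(phi, np.ones((2, 2)))

def fit_step(z, phi):
    """Hypothetical stand-in single-grid 'solver': one Chan-Vese-style
       fitting step, assigning each pixel to the closer region average."""
    ins = phi > 0
    c1 = z[ins].mean() if ins.any() else z.max()
    c2 = z[~ins].mean() if (~ins).any() else z.min()
    return np.where((z - c1) ** 2 < (z - c2) ** 2, 1.0, -1.0)

def coarse_to_fine(z, n_levels=3):
    """Full-MG-style driver: solve on the coarsest image first, then
       interpolate the level set function as the finer-level initial guess."""
    pyramid = [z]
    for _ in range(n_levels - 1):
        pyramid.append(restrict(pyramid[-1]))
    phi = np.sign(pyramid[-1] - pyramid[-1].mean())   # crude coarsest init
    phi = fit_step(pyramid[-1], phi)
    for zl in reversed(pyramid[:-1]):
        phi = fit_step(zl, interpolate(phi))
    return phi

clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0
phi = coarse_to_fine(clean)
```

Because φ is smooth away from the contour, the interpolated coarse solution is already close, and only details remain to be resolved on the fine grid.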
slide101

Multigrid for Active Contours

Original image

256x171

32x22

64x43

128x86

256x171

slide102

Animation of Multigrid Active Contours

4 levels: 256x171, 128x86, 64x43, 32x22

Fast Direct Search Algorithm

(Song-C ’02)

Insight: Segmentation only needs sign of LSF but not its value

  • Initialization. Partition the domain into the two regions {φ > 0} and {φ < 0}.
  • Advance. For each point x in the domain, if the energy F is lower when x is switched to the other region, then update this point. (The change in F can be computed fast by locally updating the region averages.)
  • Repeat step 2 until the energy F remains unchanged.

(Related to the k-means algorithm, and the “region merging” algorithm of Koepfler, Lopez, Morel ’94)
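A sketch of the two-phase sweep without the length term (our implementation; the O(1) flip test uses the standard sum/count update for region averages, so the global energy change is computed exactly from local quantities):

```python
import numpy as np

def two_phase_sweep(z, phi0, max_sweeps=10):
    """Sweep-based 2-phase fitting (no length term): a pixel flips region
       whenever the flip strictly lowers
       F = sum_in (z - c1)^2 + sum_out (z - c2)^2.
       Region sums/counts are updated locally, so each test is O(1)."""
    inside = phi0 > 0
    s1, n1 = float(z[inside].sum()), int(inside.sum())
    s2, n2 = float(z[~inside].sum()), int((~inside).sum())
    for _ in range(max_sweeps):
        changed = False
        for idx in np.ndindex(z.shape):
            v = z[idx]
            if inside[idx] and n1 > 1:
                drop = n1 / (n1 - 1) * (v - s1 / n1) ** 2   # F decrease if v leaves
                rise = n2 / (n2 + 1) * (v - s2 / n2) ** 2   # F increase if v joins
                if rise < drop:
                    inside[idx] = False
                    s1 -= v; n1 -= 1; s2 += v; n2 += 1; changed = True
            elif (not inside[idx]) and n2 > 1:
                drop = n2 / (n2 - 1) * (v - s2 / n2) ** 2
                rise = n1 / (n1 + 1) * (v - s1 / n1) ** 2
                if rise < drop:
                    inside[idx] = True
                    s2 -= v; n2 -= 1; s1 += v; n1 += 1; changed = True
        if not changed:
            break
    return inside

rng = np.random.default_rng(3)
clean = -np.ones((32, 32)); clean[8:24, 8:24] = 1.0
z = clean + 0.3 * rng.standard_normal(clean.shape)
phi0 = rng.standard_normal(clean.shape)        # arbitrary initial partition
seg = two_phase_sweep(z, phi0)
```

Each flip strictly decreases F, so the sweep terminates; which region ends up labeled "inside" depends on the initial partition.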

A 2-phase example

(a), (b), (c), (d) are four different initial conditions.

All of them converge in one sweep!

Example with Noise

Converged in 4 steps.

(Gradient Descent on Euler-Lagrange took > 400 steps.)

Convergence of the algorithm

Theorem: For 2-phase images, algorithm (w/o length term) converges in 1 sweep, independent of sweeping order.

Why is 1-step convergence possible?

Problem is global: usually cannot have finite step convergence based on local updates only

But, in our case, we can exactly calculate the global energy change via local update (can update global average locally)

Application to Piecewise Linear CV Model (Vese 2002)

Original

P.W. Constant

Converged in 4 steps

P.W. Linear

Converged in 6 steps

Other Fast Algorithms for Level Set Segmentation
  • Narrow Band (see Osher-Fedkiw ’03):
    • only solve PDE near zero LS
  • Operator Splitting (Gibou-Fedkiw ’02):
    • split length term (nonlinear diffusion) from fidelity term (optimized via k-means).
  • Threshold Dynamics (Esedoglu & Tsai ’04, +Ruuth ‘05):
    • Extends Merriman, Bence, Osher ’92 diffusion generated motion by mean curvature to MS segmentation. Alternates diffusion with thresholding.
    • Operator split phase field formulation of Mumford-Shah functional