A Motivating Application: Sensor Array Signal Processing


  • Goal: Estimate directions of arrival of acoustic sources using a microphone array

[Figure: data collection setup -- the forward problem maps the underlying "sparse" spatial spectrum f* to the array measurements; the inverse problem recovers f* from them]
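The forward model on this slide can be made concrete with a small sketch. Everything specific below -- a uniform linear array, half-wavelength spacing, a narrowband far-field model, and the grid of candidate angles -- is an illustrative assumption, not taken from the presentation; the point is only that a grid of candidate directions yields an overcomplete dictionary A with y = A f* for a sparse spatial spectrum f*.

```python
import numpy as np

# Hypothetical setup (not from the slides): M-sensor uniform linear array,
# narrowband far-field sources, N candidate arrival angles on a grid.
M, N = 8, 180                      # sensors, grid points
d_over_lambda = 0.5                # sensor spacing in wavelengths
angles = np.linspace(-90, 90, N)   # candidate directions of arrival (degrees)

# Steering matrix A: column j is the array response to a source at angles[j].
m = np.arange(M)[:, None]
A = np.exp(-2j * np.pi * d_over_lambda * m * np.sin(np.deg2rad(angles))[None, :])

# Sparse spatial spectrum f*: two sources, near -20 and +35 degrees.
f_star = np.zeros(N, dtype=complex)
f_star[np.argmin(np.abs(angles + 20))] = 1.0
f_star[np.argmin(np.abs(angles - 35))] = 0.7

y = A @ f_star                     # forward model: one noise-free array snapshot
print(y.shape)                     # (8,) -- 8 measurements, 180 unknowns
```

Estimating the directions of arrival then amounts to recovering the sparse f* from the underdetermined system y = A f*, which is exactly the inverse problem the rest of the slides address.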

Underdetermined Linear Inverse Problems
  • Basic problem: find an estimate of f, where y = A f and A is an M x N matrix with M < N
  • Underdetermined -- non-uniqueness of solutions
  • Additional information/constraints needed for a unique solution
  • A typical approach is the min-norm solution: f_min = arg min ||f||_2 subject to y = A f, i.e. f_min = A^H (A A^H)^(-1) y (see the sketch after this list)
  • What if we know f is sparse (i.e. has few non-zero elements)?
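To illustrate the min-norm bullet above, the sketch below computes f_min = A^H (A A^H)^(-1) y (via the pseudoinverse) for a small random system; the sizes and the ground-truth signal are arbitrary choices for the example. It shows that the min-norm solution explains the data exactly but is generally not sparse.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 16, 128                       # underdetermined: more unknowns than measurements
A = rng.standard_normal((M, N))

f_true = np.zeros(N)
f_true[[10, 50, 90]] = [1.0, -2.0, 0.5]   # a sparse ground truth
y = A @ f_true

# Minimum l2-norm solution: f = A^T (A A^T)^{-1} y (pinv handles this directly).
f_min_norm = np.linalg.pinv(A) @ y

print(np.allclose(A @ f_min_norm, y))      # True: it explains the data exactly...
print(np.sum(np.abs(f_min_norm) > 1e-6))   # ...but nearly all 128 entries are non-zero
```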
Sparsity constraints
  • Prefer the sparsest solution: minimize ||f||_0 subject to y = A f, where ||f||_0 is the number of non-zero elements in f
  • Can be viewed as finding a sparse representation of the signal y in an overcomplete dictionary A
  • Intractable combinatorial optimization problem
  • Are there tractable alternatives that might produce the same result?
  • Empirical observation: l1-norm-based techniques produce solutions that look sparse
    • The l1 cost function can be optimized by linear programming! (a basis-pursuit sketch follows this list)
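As a concrete instance of the linear-programming remark, here is a minimal basis-pursuit sketch for a real-valued A: min ||f||_1 subject to A f = y is rewritten with f = u - v, u, v >= 0, and handed to scipy.optimize.linprog. The problem sizes and the test signal are invented for the example.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
M, N = 16, 64
A = rng.standard_normal((M, N))

f_true = np.zeros(N)
f_true[[3, 20, 41]] = [2.0, -1.0, 1.5]
y = A @ f_true

# l1 minimization as a linear program: f = u - v with u, v >= 0,
# minimize sum(u) + sum(v) subject to A (u - v) = y.
c = np.ones(2 * N)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")

u, v = res.x[:N], res.x[N:]
f_l1 = u - v
print(np.flatnonzero(np.abs(f_l1) > 1e-6))   # typically recovers the support {3, 20, 41}
```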
l1-norm and sparsity -- a simple example

[Figure: two length-128 signals -- "A sparse signal" (norm values 1.4142 and 2.0000 shown on the slide) and "A non-sparse signal" (norm values 0.5816 and 3.5549)]

For these two signals f1 and f2 we have A*f1 = A*f2, where A is a 16x128 DFT operator (a sketch reproducing this setup follows below).

  • Goal: Rigorous characterization of the l1-sparsity link
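The slide's example can be reproduced in spirit; the exact signals f1 and f2 behind the numbers 1.4142, 2.0000, 0.5816 and 3.5549 are not recoverable from the transcript, so the sketch below builds its own pair: a two-spike f1, a 16x128 partial DFT A (which 16 rows the slide used is unknown), and the minimum-l2-norm f2 with A f1 = A f2.

```python
import numpy as np

N, M = 128, 16
k = np.arange(M)[:, None]                # the lowest 16 frequencies are an arbitrary choice
n = np.arange(N)[None, :]
A = np.exp(-2j * np.pi * k * n / N) / np.sqrt(N)   # 16x128 partial DFT operator

f1 = np.zeros(N)
f1[[30, 90]] = 1.0                       # a sparse signal: two spikes
y = A @ f1

f2 = np.linalg.pinv(A) @ y               # minimum l2-norm signal consistent with the same data

print(np.allclose(A @ f1, A @ f2))       # True: A*f1 == A*f2, as on the slide
print(np.linalg.norm(f1, 1), np.linalg.norm(f2, 1))  # the sparse f1 has the smaller l1 norm
print(np.linalg.norm(f1, 2), np.linalg.norm(f2, 2))  # the non-sparse f2 has the smaller l2 norm
```

For this construction the sparse signal wins under the l1 norm and loses under the l2 norm, which is exactly the phenomenon the slide is pointing at.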

l0 uniqueness conditions
  • Prefer the sparsest solution: f_hat_0 = arg min ||f||_0 subject to y = A f (||f||_0 = number of non-zero elements in f)
  • Let f* be the true sparse signal, so that y = A f*
  • When is f_hat_0 = f*?
  • Definition: The index of ambiguity K(A) of A is the largest integer such that any set of K(A) columns of A is linearly independent (a brute-force check of K(A) is sketched after this list).
  • Thm. 1: If ||f*||_0 < (K(A) + 1) / 2, then f* is the unique l0 solution: f_hat_0 = f*.
  • What can we say about more tractable formulations like l1?
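K(A) (one less than the "spark" of A) can be checked by brute force for toy problems; the search over column subsets is combinatorial, so the sketch below is only sensible for very small matrices, and the test matrix is an arbitrary example rather than one from the slides.

```python
import numpy as np
from itertools import combinations

def index_of_ambiguity(A, tol=1e-10):
    """Largest K such that every set of K columns of A is linearly independent."""
    M, N = A.shape
    for k in range(1, min(M, N) + 1):
        # If some k-column subset is rank-deficient, the answer is k - 1.
        for cols in combinations(range(N), k):
            if np.linalg.matrix_rank(A[:, list(cols)], tol=tol) < k:
                return k - 1
    return min(M, N)

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 8))      # tiny example; a generic A of this size gives K(A) = 4
print(index_of_ambiguity(A))         # 4, so Thm. 1 guarantees uniqueness for ||f*||_0 <= 2
```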
l1 equivalence conditions
  • Consider the l1 problem: f_hat_1 = arg min ||f||_1 subject to y = A f
  • Can we ever hope to get f_hat_1 = f_hat_0?
  • Definition: M(A) is the maximum absolute dot product of (normalized) columns of A: M(A) = max over i != j of |a_i^H a_j|
  • Thm. 2(*): If ||f*||_0 < (1 + 1/M(A)) / 2, then the l1 solution equals the l0 solution: f_hat_1 = f_hat_0 = f*.
    • f* is sparse enough => exact solution by l1 optimization
  • Can solve a combinatorial optimization problem by convex optimization! (M(A) and the resulting threshold are computed in the sketch after this list)

(*) Donoho and Elad obtained a similar result concurrently.
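M(A), by contrast, is cheap to compute, so the Thm. 2 threshold can be evaluated directly. The sketch below does this for a 16x128 partial DFT dictionary with randomly chosen rows; the row selection is an assumption, and it strongly affects the coherence.

```python
import numpy as np

rng = np.random.default_rng(3)
N, M = 128, 16
rows = np.sort(rng.choice(N, size=M, replace=False))   # 16 DFT rows picked at random
k = rows[:, None]
n = np.arange(N)[None, :]
A = np.exp(-2j * np.pi * k * n / N)

# Mutual coherence: largest absolute inner product between distinct normalized columns.
An = A / np.linalg.norm(A, axis=0)
G = np.abs(An.conj().T @ An)
np.fill_diagonal(G, 0.0)
coherence = G.max()

print("M(A) =", coherence)
print("Thm. 2 guarantees l1 = l0 recovery for ||f*||_0 <", (1 + 1 / coherence) / 2)
```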
lp (p ≤ 1) equivalence conditions
  • Consider the lp problem: f_hat_p = arg min ||f||_p subject to y = A f (one way to attack this numerically is sketched after this list)
  • How about f_hat_p = f_hat_0?
  • Thm. 3: If f* is sparse enough (with a threshold on ||f*||_0 that depends on p and on the dictionary A), then the lp solution equals the l0 solution: f_hat_p = f_hat_0 = f*.
  • Smaller p => more non-zero elements tolerated
  • As p -> 0 we recover the l0 condition, namely ||f*||_0 < (K(A) + 1) / 2
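The lp problem with p < 1 is non-convex, and the slides do not say how it is solved; one standard approach, shown here purely as an assumed stand-in (FOCUSS-style iteratively reweighted least squares), replaces each lp step with a weighted minimum-norm problem whose weights shrink small entries toward zero.

```python
import numpy as np

def irls_lp(A, y, p=0.5, iters=50, eps=1e-8):
    """Iteratively reweighted least squares for min ||f||_p s.t. A f = y (FOCUSS-style sketch)."""
    f = np.linalg.pinv(A) @ y                 # start from the min l2-norm solution
    for _ in range(iters):
        # Weighted min-norm step: f = W A^T (A W A^T)^{-1} y with W = diag(|f|^(2-p)).
        w = (np.abs(f) ** 2 + eps) ** (1 - p / 2)
        AW = A * w                            # equals A @ diag(w)
        f = w * (A.conj().T @ np.linalg.solve(AW @ A.conj().T, y))
    return f

rng = np.random.default_rng(4)
M, N = 16, 64
A = rng.standard_normal((M, N))
f_true = np.zeros(N)
f_true[[5, 22, 47]] = [1.0, -0.5, 2.0]
y = A @ f_true

f_p = irls_lp(A, y, p=0.5)
print(np.flatnonzero(np.abs(f_p) > 1e-4))     # typically recovers the support {5, 22, 47}
```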