
A Convergent Solution to Tensor Subspace Learning


Tensor Subspace Learning: Concept

  • Tensor: a multi-dimensional (or multi-way) array of components

Concept


Tensor Subspace Learning: Application

  • Real-world data are affected by multifarious factors.

For person identification, for example, we may have facial images that differ in:

► views and poses

► lighting conditions

► expressions

► image columns and rows

  • The observed data evolve differently along the variations of these factors.

Application


Tensor Subspace Learning: Application

  • It is desirable to uncover the intrinsic connections among the different factors affecting the data.

  • Tensors provide a concise and effective representation.

[Figure: a face-image tensor whose modes are image rows, image columns, pose, expression, and illumination.]
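As a hedged illustration of this representation (not from the original slides; the image size and factor counts below are made up), such a face dataset can be stored as one multi-way array whose modes match the factors in the figure:

    import numpy as np

    # Hypothetical sizes: 32x32 images, 5 poses, 6 expressions, 4 illuminations.
    n_rows, n_cols, n_poses, n_exprs, n_illums = 32, 32, 5, 6, 4

    # A 5th-order tensor: one mode per factor of variation.
    faces = np.zeros((n_rows, n_cols, n_poses, n_exprs, n_illums))

    # faces[:, :, p, e, i] is the image under pose p, expression e, illumination i.
    print(faces.ndim, faces.shape)      # 5 (32, 32, 5, 6, 4)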

Application


Traditional Tensor Discriminant algorithms

  • Two-dimensional Linear Discriminant Analysis (Ye et al.)

  • Discriminant Analysis with Tensor Representation (Yan et al.)

  • Tensor Subspace Analysis (He et al.)

  • project the tensor along different dimensions or ways

  • solve a trace ratio optimization problem

  • projection matrices for different dimensions are derived iteratively

  • DO NOT CONVERGE!

Tensor Subspace Learning algorithms


Graph Embedding – a general framework

  • An undirected intrinsic graph G={X,W} is constructed to represent the pairwise similarities over sample data.

  • A penalty graph or a scale-normalization term is constructed to impose extra constraints on the transform.

[Slide formulas: intrinsic graph objective and penalty graph constraint.]
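A hedged sketch of the general graph-embedding formulation in standard notation (my reconstruction, not necessarily the exact slide formula): the intrinsic graph supplies the objective and the penalty graph or scale normalization supplies the constraint,

    y^* = \arg\min_{y^\top B y = d} \sum_{i \ne j} \| y_i - y_j \|^2 \, W_{ij}

where W is the similarity matrix of the intrinsic graph G = {X, W}, and B is either the Laplacian matrix of the penalty graph or a diagonal matrix for scale normalization.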

Tensor Subspace Learning algorithms


Discriminant Analysis Objective

  • No closed-form solution

Solve for the projection matrices iteratively: treat one projection matrix as the variable while keeping the others constant.

Mode-k unfolding of the tensor
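A minimal NumPy sketch of mode-k unfolding (illustrative only, using one common unfolding convention; the function name `unfold` is mine, not from the paper):

    import numpy as np

    def unfold(tensor, mode):
        # Move the chosen mode to the front, then flatten the remaining modes,
        # so that each column of the result is a mode-`mode` fiber of the tensor.
        return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

    X = np.arange(2 * 3 * 4).reshape(2, 3, 4)
    print(unfold(X, 0).shape)   # (2, 12)
    print(unfold(X, 1).shape)   # (3, 8)
    print(unfold(X, 2).shape)   # (4, 6)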


Discriminant Analysis Objective

Trace Ratio: the general formulation for the objectives of discriminant-analysis-based algorithms.

  • The numerator and denominator are constructed from the mode-k unfolded data: for DATER, the between-class scatter and the within-class scatter; for TSA, a weight matrix constructed from the image manifold and a diagonal matrix of weights.
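In standard notation (a hedged reconstruction of the slide's formula; S_p^k and S_t^k are my generic labels for the numerator and denominator matrices of mode k), the per-mode subproblem reads

    U_k^* = \arg\max_{U_k} \frac{\operatorname{tr}(U_k^\top S_p^k U_k)}{\operatorname{tr}(U_k^\top S_t^k U_k)}

Different choices of S_p^k and S_t^k recover DATER, TSA, and the other graph-embedding instances.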

Objective Deduction


Why do previous algorithms not converge?

GEVD (generalized eigenvalue decomposition)

The conversion from Trace Ratio to Ratio Trace induces an inconsistency among the objectives of different dimensions!
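To make the inconsistency concrete, here is a hedged side-by-side of the two objectives in standard notation (S_p and S_t are my generic labels for the numerator and denominator matrices):

    Trace Ratio:  \max_U \frac{\operatorname{tr}(U^\top S_p U)}{\operatorname{tr}(U^\top S_t U)}

    Ratio Trace:  \max_U \operatorname{tr}\big( (U^\top S_t U)^{-1} (U^\top S_p U) \big)

The ratio trace problem has a closed-form solution via the generalized eigenvalue decomposition S_p u = \lambda S_t u, but its optimum is generally not the optimum of the trace ratio, so the per-mode GEVD steps do not maximize one common objective.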

Disagreement between the Objective and the Optimization Process


What will we do? from Trace Ratio to Trace Difference

Objective: convert the Trace Ratio into a Trace Difference.

Define the current trace-ratio value; find a new projection matrix so that the corresponding trace difference is maximized; then the objective value does not decrease.
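A hedged reconstruction of this step in standard notation (following the usual trace-ratio-to-trace-difference argument; the superscript t indexes the iteration):

    Define   \lambda^t = \frac{\operatorname{tr}(U^{t\top} S_p U^t)}{\operatorname{tr}(U^{t\top} S_t U^t)}

    Find     U^{t+1} = \arg\max_{U^\top U = I} \operatorname{tr}\big( U^\top (S_p - \lambda^t S_t) U \big)

    So that  \operatorname{tr}\big( U^{(t+1)\top} (S_p - \lambda^t S_t) U^{t+1} \big) \ge 0

    Then     \lambda^{t+1} \ge \lambda^t, i.e. the trace-ratio value does not decrease.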

from Trace Ratio to Trace Difference


What will we do? from Trace Ratio to Trace Difference

Thus, under an orthonormality constraint on the projection matrix, let the columns of the updated projection matrix be the leading eigenvectors of the trace-difference matrix.

The objective rises monotonically!

Projection matrices of different dimensions share the same objective.
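A minimal Python sketch of this iteration for a single mode (illustrative only; the matrix names S_p and S_t, the initialization, and the tolerance are my own choices, not the authors' code):

    import numpy as np

    def trace_ratio_projection(S_p, S_t, dim, n_iter=50, tol=1e-8):
        # Iteratively maximize tr(U^T S_p U) / tr(U^T S_t U) over orthonormal U.
        n = S_p.shape[0]
        U = np.eye(n)[:, :dim]                      # any orthonormal starting point
        lam = np.trace(U.T @ S_p @ U) / np.trace(U.T @ S_t @ U)
        for _ in range(n_iter):
            # Trace-difference step: leading eigenvectors of S_p - lam * S_t.
            w, V = np.linalg.eigh(S_p - lam * S_t)
            U = V[:, np.argsort(w)[::-1][:dim]]
            lam_new = np.trace(U.T @ S_p @ U) / np.trace(U.T @ S_t @ U)
            if abs(lam_new - lam) < tol:            # lam rises monotonically
                break
            lam = lam_new
        return U, lam

In the tensor case this step is applied to each mode in turn, with S_p and S_t recomputed from the data projected along the other modes, so that all modes share the single trace-ratio objective value.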

from Trace Ratio to Trace Difference


Highlights of our algorithm

  • The objective value is guaranteed to increase monotonically, and the multiple projection matrices are proven to converge.

  • Only eigenvalue decomposition is applied in the iterative optimization, which makes the algorithm extremely efficient.

  • The algorithm does not suffer from the singularity problem that is often encountered by the traditional generalized eigenvalue decomposition method used to solve the ratio trace optimization problem.

  • The low-dimensional representation derived from the subspace learning algorithms has enhanced potential classification capability.

Highlights of the Trace Ratio based algorithm


Experimental Results

The traditional ratio trace based procedure does not converge, while our new solution procedure guarantees a monotonic increase of the objective function value and commonly converges after about 4-10 iterations. Moreover, the final converged value of the objective function from our new procedure is much larger than the objective function value at any iteration of the ratio trace based procedure.

Monotonicity of the Objective


Experimental Results

The projection matrices converge after 4-10 iterations for our new solution procedure;

while for the traditional procedure, heavy oscillations exist and the solution does not converge.

Convergence of the Projection Matrices


Experimental Results

1. TMFA TR mostly outperforms all the other methods considered in this work, with only one exception for the case G5P5 on the CMU PIE database.

2. For vector-based algorithms, the trace ratio based formulation is consistently superior to the ratio trace based one for subspace learning.

3. Tensor representation has the potential to improve the classification performance for both trace ratio and ratio trace formulations of subspace learning.

Face Recognition Results


Summary

  • A novel iterative procedure was proposed to directly optimize the objective function of general subspace learning based on tensor representation.

  • The convergence of the projection matrices and the monotonicity of the objective function value were proven.

  • This is the first work to give a convergent solution for general tensor-based subspace learning.

Summary


Thank You!

