
Progress Meeting M3A Presentation of TD3



Selection Procedure

Prototype selection is based on a 3-step procedure:

[Flow chart: TD2 → Qualitative pre-screening of algorithms → Quantitative evaluation of algorithms → Final ranking and prototype selection → Selected prototype]



Selection Criteria

8 classes of parameters are considered:

  • Scientific Background and Technical Soundness

  • The selected algorithms should be based on a solid theoretical background that guarantees the accuracy of their results also at an operational level. The guidelines for rating are as follows:

    • The methodology is solid;

    • The methodology is technically convincing;

    • The methodology is at the state of the art;

    • The methodology is published in high quality journals;

    • The methodology is included in several other scientific publications or project technical reports.



Selection Criteria

  • Robustness and Generality

    • The method is suitable to be used with different kinds of images;

    • The method shows high performance on different images and different test areas;

    • Software implementations or implementation examples are available;

    • The algorithm can be used in combination with other methodologies.



Selection Criteria

  • Novelty

  • To get a high score, an algorithm should have been published or reported for the first time relatively recently in the literature. The guidelines for rating the novelty are:

    • The publications are after 2003 and introduce a novel, convincing and adequately tested solution to an existing problem;

    • The publications in remote sensing are after 1998;

    • The method is not implemented in commercial SW packages.



Selection Criteria

  • Operational Requirements

  • Operational requirements are evaluated in terms of computational complexity, time effort, cost, etc. The guidelines for rating operational perspectives are as follows:

    • The requested modifications to KIM architecture are few;

    • The algorithm works fast (e.g., near real time);

    • Processing time is likely to scale linearly with image size;

    • The hardware and disk-storage requirements are low.



Selection Criteria

  • Accuracy

  • Both absolute and relative accuracy in all operating conditions will be evaluated. The guidelines for rating the accuracy are:

    • The algorithm matches the end-user requirements and can be optimized according to them;

    • The accuracy does not depend on the availability/amount of prior information.



Selection Criteria

  • Range of Applications

  • The number and kinds of applications that an algorithm can address are evaluated:

    • The algorithm is suitable for a high number of application areas;

    • The algorithm has a high number of estimated final users for the application areas;

    • The algorithm has a high impact on the considered application areas.



Selection Criteria

  • Level of Automation

  • From an operational point of view, it is preferable that an algorithm can run in a completely automatic way. The main guidelines for rating the perspectives for automation are:

    • The number of parameters to be defined by the operator is low;

    • The physical meaning of parameters is clear;

    • The method is automatic;

    • Ground truth or prior information is not required.



Selection Criteria

  • Specific End-User Requirements

  • From an operational point of view, the capability of an algorithm to meet different possible end-user requirements is an important evaluation parameter. The main guidelines driving this ranking are:

    • The algorithm is flexible in meeting different possible accuracy requirements;

    • The algorithm can be reasonably included in an operational procedure.



Selection Procedure

  • Step 1: Qualitative pre-screening of algorithms

    • A pre-screening of the algorithms identified and described in TD2 is carried out in order to identify the most relevant methodologies with respect to the IIM-TS project objectives.

    • The preliminary qualitative evaluation is driven by the same selection criteria used in the subsequent quantitative steps. In this step, a high-level analysis of these criteria is conducted in order to identify techniques that clearly cannot reach a satisfactory ranking on several categories of parameters.

    • These techniques are discarded and not further considered in the next steps.



Pre-screening of algorithms

Binary Change Detection

Multispectral data



Pre-screening of algorithms

Binary Change Detection

SAR and Polarimetric SAR data



Pre-screening of algorithms

Binary Change Detection

Multisensor data



Pre-screening of algorithms

Multiclass Change Detection



Pre-screening of algorithms

Shape Change Detection



Pre-screening of algorithms

Trend Analysis of Temporal Series of Images

Pixel-based techniques



Pre-screening of algorithms

Trend Analysis of Temporal Series of Images

Context-based techniques



Pre-screening of algorithms

Pre-processing Multispectral Data



Pre-screening of algorithms

Pre-processing SAR Data



Pre-screening of algorithms

Pre-processing SAR Data



Pre-screening of algorithms

Pre-processing Multisensor Data



Selection Procedure

  • Step 2: Quantitative evaluation of algorithms

    • Algorithms that pass the pre-screening in step 1 are analyzed in greater detail with a quantitative evaluation.

    • This analysis is based on different parameters (scientific and technical analysis, possible impacts on the applications and end users, etc.).

    • For each algorithm (or cluster of algorithms) a method sheet is filled in, which reports details of the algorithm and individual scores for each parameter considered.
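As a rough illustration (not the actual IIM-TS template), a method sheet could be represented as a small record holding the per-category yes/no answers and the citation figure; all field names below are assumptions.

from dataclasses import dataclass, field

# Hypothetical method-sheet record; categories follow the eight criteria
# classes listed above, plus the citation-based information.
@dataclass
class MethodSheet:
    algorithm: str                                 # algorithm (or cluster) name
    answers: dict = field(default_factory=dict)    # category name -> list of True/False answers
    citations_per_year: float = 0.0                # used later for the citation-based points

    def category_score(self, category: str) -> float:
        # Normalized category score: fraction of positive answers (1 point each).
        replies = self.answers.get(category, [])
        return sum(replies) / len(replies) if replies else 0.0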



Method Sheets Organization

Algorithm characteristics



Method Sheets Organization

Evaluation



Method Sheets Organization

Evaluation



Selection Procedure

  • Step 3: Final ranking and prototype selection

    • Based on an analysis of the method sheets, a final score is given to each algorithm or method.

    • This value is used for ranking algorithms according to their relevance with respect to IIM-TS objectives;

    • The algorithms to be prototyped are identified on the basis of the score and of a final discussion of the ranking.



Total Score Computation


  • 1 point is given to each considered class of parameters for each positive answer in the corresponding category of the method sheet. Then the category score is normalized.

  • A few additional points are assigned to each method according to the number of citations per year of the algorithms in scientific papers (or in technical reports), following this table:



Total Score Computation

The score achieved for each class is properly weighted in order to take into account its relevance with respect to the goals of the project. The following equation is used:
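[The equation is not reproduced in this transcript; from the surrounding description it is presumably a weighted sum of the normalized category scores, i.e. something of the form S_total = \sum_{n=1}^{9} w_n s_n, where s_n is the normalized score of the n-th category.]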

The final score indicates the relevance of the method with respect to the prototyping procedure within IIM-TS project.
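A minimal sketch of this scoring scheme, reusing the MethodSheet record above and assuming the weighted-sum form just given; the weight values and the citation-points mapping below are illustrative placeholders, not the project's actual figures.

# Illustrative weights (w_n); the real values are not reproduced in this transcript.
WEIGHTS = {
    "scientific_background": 0.15, "robustness_generality": 0.15, "novelty": 0.10,
    "operational_requirements": 0.10, "accuracy": 0.15, "range_of_applications": 0.10,
    "automation": 0.10, "end_user_requirements": 0.05, "citations": 0.10,
}

def citation_points(citations_per_year: float) -> float:
    # Placeholder for the citation table: maps citations/year to a 0-1 score.
    return min(citations_per_year / 10.0, 1.0)

def total_score(sheet: "MethodSheet") -> float:
    # Weighted sum of the normalized category scores plus the citation-based score.
    score = sum(w * sheet.category_score(cat)
                for cat, w in WEIGHTS.items() if cat != "citations")
    return score + WEIGHTS["citations"] * citation_points(sheet.citations_per_year)

Algorithms would then be ranked by total_score and the top-ranked candidates discussed for prototyping (step 3 below).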



Total Score Computation

w_n (n = 1, …, 9) is the weight assigned to the n-th category of criteria and represents the relative relevance of the considered criterion with respect to the others:



Table of Ranking



Table of Ranking



Design of the Architecture

  • The selection of the prototype algorithms among those with the highest scores in the ranking should be finalized taking into account the possible synergy between different techniques.

  • The final selection should also be based on an adequate balance among techniques belonging to the different classes.


Design of the Architecture

[Architecture diagram. Pre-processing chains for raw SAR and raw optical images: focusing, geometric corrections, radiometric corrections, radiometric normalization, multitemporal filtering, registration, ortho-rectification, mosaicking, cloud detection, topographic corrections, pan-sharpening, image filtering, segmentation, time-varying segmentation, feature extraction. These feed four processing branches: Binary Change Detection (difference, ratio, information-theoretic similarity measures such as the KL divergence, polarimetric change indices such as correlation, thresholding based on the Bayes decision theory, context-based approaches, multiscale approaches, multimodal approaches, Multivariate Alteration Detection, neural networks); Multiclass Change Detection (post-classification comparison, direct multidate classification, compound classification, unsupervised approaches, multisensor techniques); Shape Change Detection (shape measures and comparisons); Trend Analysis (satellite linear-based index, Fourier and wavelet analysis, spatio-temporal clustering, statistical analysis of areas of interest by GIS or object analysis).]



Design of the Architecture

  • Pre-processing chain for multispectral images (geometric corrections and radiometric corrections)

  • Pre-processing chain for SAR data (geometric corrections and radiometric corrections)

  • Binary change detection:

    • Set of measures for image comparison (difference, magnitude of the difference vector, ratio, log-ratio, KL divergence, similarity measures)

    • Image splitting

    • Bayesian framework for the analysis of the comparison results (minimum-error and minimum-cost decision rules, Gaussian model, Generalized Gaussian model (?), MRF context-sensitive decision, manual or automatic initialization?)
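As a rough illustration of this chain (a sketch, not the prototype implementation), the code below compares two co-registered images with a log-ratio operator and thresholds the result with a simple two-Gaussian minimum-error Bayes rule; image names, the iterative estimation scheme, and all parameters are assumptions.

import numpy as np

def log_ratio(img_t1: np.ndarray, img_t2: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    # Log-ratio comparison of two co-registered intensity images.
    return np.log((img_t2 + eps) / (img_t1 + eps))

def bayes_threshold(x: np.ndarray, n_iter: int = 20) -> float:
    # Minimum-error decision threshold for a two-class (no-change / change)
    # Gaussian mixture, estimated with a crude iterative scheme. Illustrative only.
    x = x.ravel()
    t = float(x.mean())                              # initial guess
    for _ in range(n_iter):
        low, high = x[x <= t], x[x > t]
        if low.size == 0 or high.size == 0:
            break
        m0, s0, p0 = low.mean(), low.std() + 1e-6, low.size / x.size
        m1, s1, p1 = high.mean(), high.std() + 1e-6, high.size / x.size
        grid = np.linspace(x.min(), x.max(), 1000)
        d0 = p0 * np.exp(-0.5 * ((grid - m0) / s0) ** 2) / s0   # weighted class densities
        d1 = p1 * np.exp(-0.5 * ((grid - m1) / s1) ** 2) / s1
        crossings = np.where(np.diff(np.sign(d1 - d0)) != 0)[0]
        crossings = [i for i in crossings if m0 < grid[i] < m1]  # boundary between the class means
        if not crossings:
            break
        t = float(grid[crossings[0]])
    return t

def binary_change_map(img_t1: np.ndarray, img_t2: np.ndarray) -> np.ndarray:
    lr = np.abs(log_ratio(img_t1, img_t2))
    return lr > bayes_threshold(lr)                  # True where a change is detected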



Design of the Architecture

  • Multiclass change detection:

    • Unsupervised method based on the Autochange algorithm

    • Supervised methods based on direct multidate classification (MDC) and post-classification comparison (PCC); these need a distribution-free classification module (a minimal PCC sketch follows this list)

    • Rule based multisensor classifier

  • Trend analysis of time series:

    • Spatio-temporal clustering (data mining)

    • Tools for Fourier and wavelet transforms (FT and WT)

    • Hot spot monitoring via GIS fusion

  • Shape change detection measure
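A post-classification comparison can be sketched as follows: each date is classified independently and the two label maps are compared pixel by pixel to obtain a class-transition map. The k-nearest-neighbours classifier used here is only a stand-in for the distribution-free classification module mentioned above; array shapes and variable names are assumptions.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def classify(image: np.ndarray, train_pixels: np.ndarray, train_labels: np.ndarray) -> np.ndarray:
    # Classify an H x W x B image with a distribution-free (k-NN) classifier.
    # The classifier choice is a stand-in, not the module selected by the project.
    h, w, b = image.shape
    clf = KNeighborsClassifier(n_neighbors=5).fit(train_pixels, train_labels)
    return clf.predict(image.reshape(-1, b)).reshape(h, w)

def post_classification_comparison(labels_t1: np.ndarray, labels_t2: np.ndarray) -> np.ndarray:
    # Encode every "from class -> to class" transition as a unique integer label;
    # unchanged pixels are those where labels_t1 == labels_t2.
    n_classes = int(max(labels_t1.max(), labels_t2.max())) + 1
    return labels_t1 * n_classes + labels_t2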

