
Estimating Kolmogorov Entropy from Acoustic Attractors from a Recognition Perspective


Presentation Transcript


  1. Estimating Kolmogorov Entropy from Acoustic Attractors from a Recognition Perspective
  Saurabh Prasad, Intelligent Electronic Systems, Human and Systems Engineering, Department of Electrical and Computer Engineering

  2. Estimating the correlation integral from a time series
  The correlation integral (correlation sum) of a system's attractor quantifies the average number of neighbors within a neighborhood of radius $\varepsilon$ along the trajectory:
  $C(\varepsilon) = \lim_{N \to \infty} \frac{2}{N(N-1)} \sum_{i<j} \Theta\left(\varepsilon - \lVert \mathbf{x}_i - \mathbf{x}_j \rVert\right)$
  where $\mathbf{x}_i$ represents the i-th point on the trajectory, $\lVert \cdot \rVert$ is a valid norm, and $\Theta(\cdot)$ is the Heaviside unit step function (serving as a count function here).
  At a sufficiently large embedding dimension ($m > 2D + 1$), we have $C(\varepsilon) \sim \varepsilon^{D}$, where $D$ is the fractal dimension of the attractor.
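The correlation sum can be computed directly from a delay-embedded speech frame. Below is a minimal numpy sketch; the function names, the max-norm, and the brute-force pairwise distance computation are illustrative assumptions, not the implementation used in this work.

```python
import numpy as np

def delay_embed(x, m, tau=1):
    """Reconstruct an m-dimensional trajectory from a scalar series via time-delay embedding."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau          # number of embedded points
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

def correlation_sum(x, m, eps, tau=1):
    """Grassberger-Procaccia correlation sum C_m(eps): fraction of distinct
    point pairs on the reconstructed attractor lying within radius eps."""
    pts = delay_embed(x, m, tau)
    n = len(pts)
    # Pairwise max-norm distances; O(n^2) memory, so intended for short frames only.
    dists = np.max(np.abs(pts[:, None, :] - pts[None, :, :]), axis=-1)
    iu = np.triu_indices(n, k=1)
    return np.mean(dists[iu] < eps)
```

For a sustained phone one would sweep eps over a log-spaced grid and read the correlation dimension off the slope of log C(eps) versus log eps.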

  3. Order-q Renyi entropy and K2 entropy
  Divide the state space into disjoint boxes of size $\varepsilon$. If the evolution of the state space that generated the observable is sampled at time intervals $\tau$, then $p(i_1, i_2, \ldots, i_d)$ represents the joint probability that $\mathbf{x}(\tau)$ lies in box $i_1$, $\mathbf{x}(2\tau)$ lies in box $i_2$, and so on. The order-q Renyi entropy is
  $K_q = -\lim_{\tau \to 0} \lim_{\varepsilon \to 0} \lim_{d \to \infty} \frac{1}{d\tau} \frac{1}{q-1} \ln \sum_{i_1, \ldots, i_d} p^{q}(i_1, \ldots, i_d)$
  Numerically, the Kolmogorov entropy can be estimated as the second-order Renyi entropy (K2), obtained from correlation sums at successive embedding dimensions:
  $K_2 \approx \frac{1}{\tau} \ln \frac{C_m(\varepsilon)}{C_{m+1}(\varepsilon)}$
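Given correlation sums at two successive embedding dimensions, K2 follows from the log-ratio above. A minimal sketch, reusing correlation_sum from the previous slide's example; the default sampling rate of 22500 Hz simply mirrors the 22.5 kHz speech data described on the next slide.

```python
import numpy as np

def k2_estimate(x, m, eps, tau=1, fs=22500):
    """Second-order (correlation) entropy estimate
        K2 ~ (1 / (tau * dt)) * ln( C_m(eps) / C_{m+1}(eps) )
    Relies on correlation_sum() from the sketch under slide 2."""
    dt = 1.0 / fs                                  # sampling interval in seconds
    c_m = correlation_sum(x, m, eps, tau)          # neighbor count at dimension m
    c_m1 = correlation_sum(x, m + 1, eps, tau)     # neighbor count at dimension m + 1
    return np.log(c_m / c_m1) / (tau * dt)         # entropy in nats per second
```

In practice the estimate is read off where it plateaus over a range of m and eps; a single (m, eps) pair, as in this sketch, is only a starting point.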

  4. Second-Order Kolmogorov Entropy Estimation of Speech Data
  • Speech data, sampled at 22.5 kHz
  • Sustained phones (/aa/, /ae/, /eh/, /sh/, /z/, /f/, /m/, /n/)
  • Output: second-order Kolmogorov entropy (K2)
  • We wish to analyze:
    • the presence or absence of chaos in each time series
    • their discrimination characteristics across attractors from different sound units (for classification)

  5. The analysis setup
  • Currently, this analysis includes estimates of K2 for different embedding dimensions.
  • Variation in entropy estimates with the neighborhood radius (epsilon) was studied.
  • Variation in entropy estimates with the SNR of the signal was studied (a noise-mixing sketch follows this list).
  • Currently, the analysis covers 3 vowels, 2 nasals and 2 fricatives.
  • Results show that vowels and nasals have a much smaller entropy than fricatives.
  • K2 consistently decreases with embedding dimension for vowels and nasals, while for fricatives it consistently increases.
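For the SNR study, clean sustained phones can be mixed with additive white Gaussian noise at a prescribed level before re-estimating K2. A minimal sketch under that assumption; the function name and the Gaussian-noise choice are illustrative.

```python
import numpy as np

def add_noise_at_snr(x, snr_db, rng=None):
    """Return x corrupted by white Gaussian noise scaled to the requested SNR in dB."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    noise_power = np.mean(x ** 2) / (10.0 ** (snr_db / 10.0))   # target noise variance
    return x + rng.normal(0.0, np.sqrt(noise_power), size=x.shape)
```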

  6. The analysis setup (in progress / coming soon)
  • Data size (length of the time series): this is crucial for our purpose, since we wish to extract information from short time series (sample data from utterances).
  • Speaker variation: we wish to study variations in the Kolmogorov entropy of phone- or word-level attractors
    • across different speakers
    • across different phones/words
    • across different broad phone classes

  7. Correlation Entropy vs. Embedding Dimension (Various Epsilons)

  8. Correlation Entropy vs. Embedding Dimension (Various Epsilons)

  9. Correlation Entropy vs. Embedding Dimension (Various Epsilons)

  10. Correlation Entropy vs. Embedding Dimension (Various SNRs)

  11. Correlation Entropy vs. Embedding Dimension (Various Data Lengths)

  12. Measuring Discrimination Information in K2-Based Features
  Kullback-Leibler (KL) divergence provides an information-theoretic distance measure between two statistical models:
  $D(p_i \,\|\, p_j) = \int p_i(x) \ln \frac{p_i(x)}{p_j(x)} \, dx$   (likelihood of class i vs. class j)
  The average discriminating information between class i and class j combines the two likelihood-ratio terms (i vs. j and j vs. i):
  $J(i, j) = D(p_i \,\|\, p_j) + D(p_j \,\|\, p_i)$
  For normal densities $p_i = \mathcal{N}(\mu_i, \Sigma_i)$ and $p_j = \mathcal{N}(\mu_j, \Sigma_j)$:
  $D(p_i \,\|\, p_j) = \frac{1}{2} \left[ \operatorname{tr}(\Sigma_j^{-1} \Sigma_i) + (\mu_j - \mu_i)^{\top} \Sigma_j^{-1} (\mu_j - \mu_i) - d + \ln \frac{\det \Sigma_j}{\det \Sigma_i} \right]$
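The closed form for normal densities translates directly into numpy. A minimal sketch of the per-direction divergence and its symmetrized sum; variable names are illustrative, and the single-Gaussian-per-class assumption is only one possible modeling choice.

```python
import numpy as np

def kl_gaussian(mu_i, cov_i, mu_j, cov_j):
    """KL divergence D( N(mu_i, cov_i) || N(mu_j, cov_j) ) for full-covariance Gaussians."""
    mu_i, mu_j = np.atleast_1d(mu_i).astype(float), np.atleast_1d(mu_j).astype(float)
    cov_i, cov_j = np.atleast_2d(cov_i).astype(float), np.atleast_2d(cov_j).astype(float)
    d = mu_i.size
    cov_j_inv = np.linalg.inv(cov_j)
    diff = mu_j - mu_i
    return 0.5 * (np.trace(cov_j_inv @ cov_i)
                  + diff @ cov_j_inv @ diff
                  - d
                  + np.log(np.linalg.det(cov_j) / np.linalg.det(cov_i)))

def symmetric_kl(mu_i, cov_i, mu_j, cov_j):
    """Average discriminating information: D(i||j) + D(j||i)."""
    return (kl_gaussian(mu_i, cov_i, mu_j, cov_j)
            + kl_gaussian(mu_j, cov_j, mu_i, cov_i))
```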

  13. Measuring Discrimination Information in K2-Based Features
  Statistics of entropy estimates over several frames for various phones
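Statistics like these can be gathered by estimating K2 frame by frame within each phone segment and summarizing the resulting distribution. A hedged sketch reusing k2_estimate from the slide 3 example; the frame length and hop are illustrative parameters, not the settings behind the reported results.

```python
import numpy as np

def framewise_k2_stats(x, m, eps, frame_len=1024, hop=512, tau=1, fs=22500):
    """Estimate K2 on overlapping frames of a phone segment and return (mean, std).
    Relies on k2_estimate() from the sketch under slide 3."""
    estimates = np.array([k2_estimate(x[s:s + frame_len], m, eps, tau=tau, fs=fs)
                          for s in range(0, len(x) - frame_len + 1, hop)])
    return estimates.mean(), estimates.std()
```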

  14. Measuring Discrimination Information in K2-Based Features
  KL-divergence measure between K2 features from various phonemes for two speakers

  15. Plans
  • Finish studying the use of K2 entropy as a feature characterizing phone-level attractors.
  • Perform a similar analysis on Lyapunov exponents and correlation dimension estimates.
  • Measure speaker dependence in this invariant.
  • Use this setup on a meaningful recognition task.
  • Noise robustness, parameter tweaking, integrating these features with MFCCs.
  • Statistical modeling…
