### Low-Dimensional Chaotic Signal Characterization Using Approximate Entropy

Soundararajan Ezekiel

Matthew Lang

Computer Science Department

Indiana University of Pennsylvania

### Roadmap

- Overview
- Introduction
- Basics and Background
- Methodology
- Experimental Results
- Conclusion

### Overview

- Many signals appear to be random
- They may in fact be chaotic or fractal in nature
- Noise can mask the underlying dynamics, so noisy systems must be treated with care
- Analysis of chaotic properties is therefore in order
- Our method: approximate entropy

### Introduction

- Chaotic behavior lacks periodicity
- Historically, non-periodicity was taken to imply randomness
- Today we know that such behavior may instead be chaotic or fractal in nature
- This is the power of fractal and chaos analysis

- Chaotic systems have four essential characteristics:
- the system is deterministic
- it is sensitive to initial conditions
- its behavior is unpredictable
- its values are confined to attractors
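Sensitivity to initial conditions can be illustrated with the logistic map, a standard one-variable chaotic system. The example below is ours, not from the slides; the map, parameter values, and perturbation size are chosen purely for illustration:

```python
# The logistic map x_{n+1} = r*x_n*(1 - x_n) with r = 4: a deterministic
# system whose trajectories diverge rapidly from nearby starting points.

def logistic_trajectory(x0, r=4.0, n=50):
    """Iterate the logistic map n times from initial condition x0."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.4)
b = logistic_trajectory(0.4 + 1e-9)  # perturb the start by one part in 10^9

print(abs(a[1] - b[1]))  # still tiny after one step
print(max(abs(x - y) for x, y in zip(a[40:], b[40:])))  # order-one divergence
```

Both runs are fully deterministic, yet within a few dozen iterations the two trajectories are as far apart as arbitrary points on the attractor.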

- The attractor's dimension is a useful quantity and a good starting point
- Even an incomplete description of the attractor is useful

### Basics and Background

- Fractal analysis
- A fractal is a set whose Hausdorff-Besicovitch dimension exceeds its topological dimension
- Fractals can also be described by their self-similarity property
- Goal: find self-similar features and characterize the data set

- Chaotic analysis
- The output of the system mimics random behavior
- Goal: determine the mathematical form of the underlying process
- Performed by transforming the data into a phase space

- Definitions
- Phase space: an n-dimensional space, where n is the number of dynamical variables
- Attractor: the bounded set in phase space toward which the values of the variables settle
- Strange attractor: an attractor that is fractal in nature
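A phase space can be reconstructed from a single measured signal by time-delay embedding, which is how the algorithm later builds its {Wi} vs. {Wi+τ} plot. A minimal sketch (the signal values are invented for the example):

```python
# Delay embedding: point i is (s_i, s_{i+tau}, ..., s_{i+(E-1)*tau}).

def embed(signal, E, tau):
    """Return the E-dimensional delay-embedded points of a 1-D signal."""
    n = len(signal) - (E - 1) * tau
    return [tuple(signal[i + k * tau] for k in range(E)) for i in range(n)]

# With E = 2 and tau = 1 this is exactly a plot of s_i against s_{i+1}.
signal = [0.0, 0.8, 0.64, 0.92, 0.29]
points = embed(signal, E=2, tau=1)
print(points[0])  # (0.0, 0.8)
```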

- Analysis of phase space
- Determine topological properties
- visual analysis
- capacity, correlation, information dimension
- approximate entropy
- Lyapunov exponents

- Fractal dimension of the attractor
- Related to the number of independent variables needed to generate the time series
- That number is the smallest integer greater than the fractal dimension of the attractor

- Box dimension
- An estimator for the fractal dimension
- Measures the geometric aspect of the signal on the attractor
- Obtained by counting the boxes needed to cover the attractor
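The box-counting idea can be sketched as follows. This is an illustration, not the authors' implementation; the two-scale slope estimate and the test point sets are our assumptions:

```python
# Cover the 2-D phase-space points with boxes of side eps, count the
# occupied boxes N(eps), and estimate the dimension from the slope of
# log N(eps) versus log(1/eps) between two scales.
from math import log

def box_count(points, eps):
    """Number of eps-sized boxes occupied by the point set."""
    return len({(int(x // eps), int(y // eps)) for x, y in points})

def box_dimension(points, eps1, eps2):
    """Two-scale slope estimate of the box (capacity) dimension."""
    n1, n2 = box_count(points, eps1), box_count(points, eps2)
    return (log(n2) - log(n1)) / (log(1 / eps2) - log(1 / eps1))

# Sanity checks: a filled square scales like dimension 2, a line like 1.
square = [(i / 100, j / 100) for i in range(100) for j in range(100)]
line = [(i / 100, 0.0) for i in range(100)]
print(box_dimension(square, 0.1, 0.02))  # close to 2
print(box_dimension(line, 0.1, 0.02))    # close to 1
```

In practice one fits the slope over many scales rather than two, and stops refining eps once boxes contain only a few points.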

- Information dimension
- Similar to the box dimension
- Accounts for the frequency with which each box is visited
- Based on point weighting; measures the rate of change of information content
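The visitation-frequency weighting can be sketched the same way. The information sum, natural logarithm, and two-scale slope below are assumptions made for illustration, not the slides' formulas:

```python
# Weight each box by its visitation frequency p_i and track how the
# information sum -sum(p_i * log p_i) grows as the boxes shrink.
from math import log

def information(points, eps):
    """Shannon information of the box-occupancy distribution at scale eps."""
    counts = {}
    for x, y in points:
        box = (int(x // eps), int(y // eps))
        counts[box] = counts.get(box, 0) + 1
    total = len(points)
    return -sum((c / total) * log(c / total) for c in counts.values())

def information_dimension(points, eps1, eps2):
    """Rate of change of information content between two box sizes."""
    i1, i2 = information(points, eps1), information(points, eps2)
    return (i2 - i1) / (log(1 / eps2) - log(1 / eps1))

# A uniformly visited square matches the box dimension (close to 2);
# a single repeatedly visited point gives 0.
square = [(i / 100, j / 100) for i in range(100) for j in range(100)]
print(information_dimension(square, 0.1, 0.02))
print(information_dimension([(0.5, 0.5)] * 50, 0.1, 0.02))  # 0.0
```

When every box is visited equally the information dimension reduces to the box dimension; the two differ exactly when the attractor is visited unevenly.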

### Methodology

- Approximate entropy is based on the information dimension
- The signal is embedded in low-dimensional phase spaces
- Its computation is similar to that of the correlation dimension

### Algorithm

- Given a signal {Si}, calculate its approximate entropy by the following steps. Note that the approximate entropy may be calculated for the entire signal, or an entropy spectrum may be calculated over windows {Wi} on {Si}. If the entropy of the entire signal is being calculated, take {Wi} = {Si}.

- Step 1: Truncate the peaks of {Wi}. During the digitization of analog signals, spurious values may be generated by the monitoring equipment.
- Step 2: Calculate the mean and standard deviation (Sd) of {Wi} and set the tolerance R = 0.3 * Sd to reduce the effect of noise.

- Step 3: Construct the phase space by plotting {Wi} vs. {Wi+τ}, where τ is the time lag, in an E = 2 space.
- Step 4: Calculate the Euclidean distance between each pair of points in the phase space. For each point i, count Ci(R), the number of points whose distance from it is less than R.

- Step 5: Calculate the mean of the Ci(R); the logarithm of this mean is the approximate entropy Apn(E) for Euclidean dimension E = 2.
- Step 6: Repeat Steps 2-5 for E = 3.
- Step 7: The approximate entropy of {Wi} is Apn(2) - Apn(3).
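The steps above can be sketched in Python. This is a minimal illustration of the recipe, not the authors' code: Step 1's equipment-specific peak truncation is omitted, τ = 1 is assumed, and the natural logarithm is an assumption (the slides do not specify a base). The demo signals at the end are also ours, not the paper's data:

```python
# Sketch of Steps 2-7 of the approximate-entropy algorithm.
from math import log, sqrt

def apn(window, E, tau, R):
    """Apn(E): log of the mean number of phase-space neighbours within R."""
    # Step 3: delay-embed the window into E-dimensional phase space.
    n = len(window) - (E - 1) * tau
    pts = [[window[i + k * tau] for k in range(E)] for i in range(n)]
    # Step 4: for each point, count the other points within distance R.
    counts = [
        sum(
            1 for j in range(n) if j != i and
            sqrt(sum((a - b) ** 2 for a, b in zip(pts[i], pts[j]))) < R
        )
        for i in range(n)
    ]
    # Step 5: the log of the mean count.
    return log(sum(counts) / n)

def approximate_entropy(window, tau=1):
    # Step 2: tolerance R = 0.3 * standard deviation of the window.
    n = len(window)
    mean = sum(window) / n
    sd = sqrt(sum((x - mean) ** 2 for x in window) / n)
    R = 0.3 * sd
    # Steps 6-7: the approximate entropy is Apn(2) - Apn(3).
    return apn(window, 2, tau, R) - apn(window, 3, tau, R)

# Illustrative signals: a periodic sawtooth vs. irregular values from a
# simple linear congruential generator.
periodic = [(i % 10) / 10 for i in range(200)]
x, noise = 1, []
for _ in range(200):
    x = (1103515245 * x + 12345) % 2 ** 31
    noise.append(x / 2 ** 31)
print(approximate_entropy(periodic))  # near zero: highly predictable
print(approximate_entropy(noise))     # much larger: close to random
```

The intuition behind Step 7: if neighbours in the E = 2 embedding remain neighbours in E = 3, the signal is predictable and the difference Apn(2) - Apn(3) is small; for random data most neighbours are lost when the dimension increases, and the difference is large.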

### Experimental Results

Approximate-entropy results were presented for the following test signals (the figures are not reproduced in this transcript):

- Noise
- HRV (young subject)
- HRV (older subject)
- Stock signal
- Seismic signal (two examples)

### Conclusion

- High approximate entropy indicates randomness
- Low approximate entropy indicates periodicity
- Approximate entropy can therefore be used to evaluate the predictability of a signal
- Low predictability corresponds to randomness