
Information Theory


Presentation Transcript


  1. Information Theory

  2. information • the non-redundant portion of any string of numbers • .333333333333 . . . holds very little information, since the 3s can be expressed more succinctly as a fraction. • π, or 3.141592653589793 . . ., with its apparently endless non-patterned and non-repeating sequence of digits, contains a very high level of information (often called entropy).

  3. entropy • Information measures the amount of non-redundant content in data. • Entropy measures the amount of improbability or unpredictability in data.
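
To make unpredictability concrete, here is a minimal sketch (standard Shannon entropy, H = Σ p·log2(1/p) bits per symbol; the example strings are my own) that scores a highly redundant string against the opening digits of π:

```python
from collections import Counter
from math import log2

def shannon_entropy(s: str) -> float:
    """Average information per symbol, in bits: H = sum of p * log2(1/p)."""
    counts = Counter(s)
    n = len(s)
    return sum((c / n) * log2(n / c) for c in counts.values())

# A highly redundant string carries almost no information per symbol;
# the digits of pi come close to the maximum for a 10-symbol alphabet.
print(shannon_entropy("333333333333"))        # 0.0
print(shannon_entropy("314159265358979323"))  # about 3.0
```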

  4. Preference or aesthetic value • High information (entropy) is no better or worse than low information.

  5. Example • language is highly redundant • J-st tr- t- r--d th-s s-nt-nc-. • The meaning of a message can remain unchanged even though parts of it are removed.

  6. Definition • Information theory is a branch of communications theory that deals with the amount and accuracy of information when transmitted from a source, through a medium, to a destination.

  7. Claude Shannon (1949) • (1) Zero-order symbol approximation, as in ZZFYNN PQZF LQN, where symbols are chosen at random with no regard to frequency or context; • (2) First-order symbol approximation, as in ID AHE RENI MEAT, where individual letter frequencies are taken into consideration; • (3) Second-order symbol approximation, as in RENE ID AHA MIET, where the likelihood of each letter following the preceding letter is also taken into consideration; • (4) Third-order symbol approximation, as in HE ARE ID TI NEAM, where the two preceding letters are taken into consideration; • (5) First-order word approximation, as in I DARE HE IN TAME, where whole-word frequencies are taken into consideration; • (6) Second-order word approximation, as in I DARE HE NAME IT, where word frequencies and probable word order are taken into consideration.
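
The following sketch is not Shannon's original procedure, and its tiny training text is only a stand-in, but it shows how such order-n approximations can be generated: each new character is sampled from the distribution of characters that followed the previous n characters in a corpus.

```python
import random
from collections import Counter, defaultdict

def order_n_approximation(corpus: str, n: int, length: int = 40, seed: int = 0) -> str:
    """Generate text in which each character is drawn from the distribution of
    characters that followed the preceding n characters in the corpus.
    n = 0 uses raw symbol frequencies; larger n captures more local structure."""
    rng = random.Random(seed)
    if n == 0:
        return "".join(rng.choices(corpus, k=length))
    # For every n-character context, count the characters that followed it.
    followers = defaultdict(Counter)
    for i in range(len(corpus) - n):
        followers[corpus[i:i + n]][corpus[i + n]] += 1
    context = corpus[:n]
    out = list(context)
    while len(out) < length:
        options = followers.get(context)
        if not options:                      # dead end: restart from a random context
            context = rng.choice(list(followers))
            options = followers[context]
        nxt = rng.choices(list(options), weights=list(options.values()))[0]
        out.append(nxt)
        context = (context + nxt)[-n:]
    return "".join(out)

# Stand-in training text; any sizable English corpus would serve better.
corpus = "i dare he in tame it i name id are the near it he tame in "
for n in range(4):
    print(n, order_n_approximation(corpus, n))
```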

  8. Algorithmic information theory • (AIT), a branch of information theory, concentrates less on the communication accuracy of information, and more on the precise amount of non-compressible information contained in a message.

  9. Gregory Chaitin • Chaitin defines the complexity of something as the size of the simplest theory for it, in other words, the size of the smallest program for calculating it. This is the central idea of algorithmic information theory (AIT), a field of theoretical computer science. Using the mathematical concept of program-size complexity, AIT exhibits irreducible mathematical facts: mathematical facts that cannot be demonstrated using any mathematical theory simpler than they are.
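
A toy illustration of program-size complexity (my example, not Chaitin's): a very long but very regular string can be produced by a program far shorter than the string itself, so its algorithmic information content is small.

```python
# A tiny "theory" (program) that generates a million-character string of 3s,
# echoing the .333333... example from slide 2.
program = "print('3' * 1_000_000)"

# The output is 1,000,000 characters long, but the program that produces it
# is only a couple of dozen characters.  Program size, not output size,
# bounds the string's algorithmic information content.  For a string of a
# million genuinely random digits, no comparably short program is expected
# to exist.
print(len(program))
```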

  10. Compression • An important component of AIT is that the files compressed must be "lossless," that is, restorable (by reverse processing) to their precise original form.
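
A minimal demonstration using Python's standard zlib (the data is invented): lossless compression must restore the original exactly, and the achieved ratio gives a rough upper bound on the non-redundant portion of the data.

```python
import os
import zlib

def compressed_fraction(data: bytes) -> float:
    """Compressed size over original size: a rough upper-bound estimate
    of the non-redundant fraction of the data."""
    return len(zlib.compress(data, 9)) / len(data)

redundant = b".333333333333" * 50   # highly patterned
patternless = os.urandom(650)       # effectively incompressible

# Lossless: decompression restores the original byte-for-byte.
assert zlib.decompress(zlib.compress(redundant)) == redundant

print(compressed_fraction(redundant))    # small fraction
print(compressed_fraction(patternless))  # at or slightly above 1.0 (zlib overhead)
```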

  11. Musical Algorithmic Information Theory • MIDI (Musical Instrument Digital Interface) presents an interesting example of compression. • MIDI produces accurate results with a fraction of the storage space that recorded audio requires. • MIDI also loses a significant amount of information due to its score-like representation approach.
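
A back-of-the-envelope comparison (approximate figures; Standard MIDI File overhead such as delta times is ignored) of the storage a single note event needs versus one second of uncompressed CD audio:

```python
# A MIDI note-on and note-off message are 3 bytes each;
# CD-quality audio is 44,100 samples/s, 2 channels, 2 bytes per sample.
midi_note_bytes = 2 * 3
audio_second_bytes = 44_100 * 2 * 2
print(audio_second_bytes / midi_note_bytes)  # tens of thousands of times larger
```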

  12. Better example • The information content of a musical work, the part that cannot be further reduced, consists of material that does not repeat (exactly or in variation) often enough for a symbol to replace it and thereby compress the passage.
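
As a sketch of that idea, the following uses zlib as a stand-in for a symbol-substitution scheme, applied to two invented pitch sequences: a passage built from a repeated motif reduces far more than one with little exact repetition.

```python
import random
import zlib

def reducible_fraction(pitches) -> float:
    """Fraction of a pitch sequence that a substitution-based compressor
    (zlib here, as a stand-in) can strip away; what remains approximates
    the passage's information content."""
    raw = ",".join(map(str, pitches)).encode()
    return 1 - len(zlib.compress(raw, 9)) / len(raw)

motif = [60, 62, 64, 65, 67, 65, 64, 62]   # invented 8-note motif (MIDI pitches)
repetitive = motif * 16                     # the motif restated 16 times
rng = random.Random(0)
varied = [rng.randrange(48, 84) for _ in range(128)]  # little exact repetition

print(reducible_fraction(repetitive))  # high: most of the passage is redundant
print(reducible_fraction(varied))      # lower: more of the passage is irreducible
```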

  13. Aesthetics again • Is low-information-content music more developed and well formed, while high-information-content music is more erratic and disorganized? • Does better music have lower information content? • Not necessarily.

  14. Examples • The information contents of various segments of works by J. S. Bach (0.51), W. A. Mozart (0.45), Ludwig van Beethoven (0.50), Johannes Brahms (0.64), Anton Webern (0.88), Ernst Krenek (0.87), and David Cope (0.63), measured with a very simple compression scheme, demonstrate interesting results.

  15. Dynamic Musical AIT (DMAIT) • In music, the ear shifts from one musical parameter to another. • Melody, harmony, dynamics, timbre, rhythm, and so on all jostle for attention, each succeeding at one time or another. • The mind’s ear tends to go where the most information occurs.

  16. How it works • Dynamic Musical AIT systematically refigures Musical AIT for several musical parameters simultaneously. • DMAIT fingerprints help identify the important elements of music at different times. • To avoid flatlining in longer works, a moving DMAIT aperture covering the several beats that precede the point being calculated avoids re-computing information content from the very beginning of the work.
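
The actual DMAIT computation is not given here; the sketch below, with invented note data and a zlib-based stand-in measure, shows the general shape: an information estimate per musical parameter over a moving aperture of recent events.

```python
import zlib

def window_info(symbols, window=8):
    """Compression-ratio estimate of information content at each position,
    computed over a moving aperture of the preceding `window` symbols
    (a stand-in for the DMAIT measure described above).  With such short
    windows the absolute values are rough; only relative comparison matters."""
    profile = []
    for i in range(window, len(symbols) + 1):
        chunk = ",".join(map(str, symbols[i - window:i])).encode()
        profile.append(len(zlib.compress(chunk, 9)) / len(chunk))
    return profile

# Invented example data: one list per musical parameter.
parameters = {
    "pitch":    [60, 62, 64, 62, 60, 62, 64, 62, 61, 66, 59, 70, 63, 58, 71, 65],
    "duration": [1, 1, 1, 1, 1, 1, 1, 1, 2, 1, 0.5, 0.5, 2, 1, 0.5, 3],
    "velocity": [64] * 8 + [64, 80, 50, 90, 40, 100, 30, 110],
}

# In this model, the ear follows whichever parameter carries the most
# information at each moment.
for name, values in parameters.items():
    print(name, [round(x, 2) for x in window_info(values)])
```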

  17. Bartók's Mikrokosmos No. 81

  18. A DMAIT graph of the previous example
