Sensor Networks, Rate Distortion Codes, and Spin Glasses
Tatsuto Murayama
NTT Communication Science Laboratories
In collaboration with Peter Davis
March 7th, 2008 at the Chinese Academy of Sciences
Sensors transmit their noisy observations independently.
A central computer estimates the quantity of interest from the sensor information.
The network operates under a limited bandwidth constraint.
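As a toy illustration of this setup (the function names and parameters here are my own, not from the talk), each sensor can be modeled as observing the source through a binary symmetric channel:

```python
import random

def observe(source_bit, flip_prob):
    """One sensor's noisy observation: the source bit passes through a
    binary symmetric channel and is flipped with probability flip_prob."""
    return source_bit ^ (random.random() < flip_prob)

def sensor_network(source_bit, n_sensors, flip_prob):
    """Independent noisy observations from n_sensors identical sensors."""
    return [observe(source_bit, flip_prob) for _ in range(n_sensors)]

random.seed(0)
obs = sensor_network(source_bit=1, n_sensors=10, flip_prob=0.2)
print(obs)  # a list of ten 0/1 observations, most of them equal to 1
```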
"Supply-Side Economics": semiconductors are going to be very small and very cheap, so manufacturers will want to sell them in large quantities!
Large-scale information integration
Network Capacity is limited
Information loss via sensing
Information loss via communications
High Noise Region: the network is going to be large and dense!
Finite Network Capacity: efficient use of the given bandwidth is required!
Need a new information integration theory!
Saturate Strategy (SS)
Transmit as much sensor information as possible, without data compression.
Which strategy performs better?
A small quantity of high quality statistics
A large quantity of low quality statistics
Large System Strategy (LSS)
Transmit compressed information from the overwhelming majority of sensors.
Strategic Transition Point.
In the case of the 'Ising' (binary) alphabet:
A purely random source is assumed to be observed.
An independent decoding process is enforced.
A bitwise majority vote is applied.
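The bitwise majority vote itself is straightforward; a minimal sketch over the ±1 ('Ising') alphabet (the tie-breaking convention is my own choice):

```python
def majority_vote(bits):
    """Estimate the source bit from ±1 observations by majority vote.
    Returns +1 or -1; a tie is broken toward +1 here."""
    return 1 if sum(bits) >= 0 else -1

# Nine sensors, two of them flipped by noise: the vote still recovers +1.
print(majority_vote([1, 1, 1, -1, 1, 1, -1, 1, 1]))  # 1
```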
Two messages saturate the network.
Cost of communication = number of sensors × (bits of information).
Moderate aggregation levels are possible.
Still, two messages saturate the network.
Cost of communication = number of sensors × data rate.
We can make system as large as we want!
We have the following result.
What is the best bound for lossy compression?
The best bound is the rate distortion function.
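For a uniform binary source under Hamming distortion, this bound is the classical R(D) = 1 − H2(D); a small numerical sketch (function names are mine):

```python
import math

def binary_entropy(p):
    """Binary entropy H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion(D):
    """Rate-distortion function of a uniform binary source under Hamming
    distortion: R(D) = 1 - H2(D) for 0 <= D <= 1/2, and 0 beyond."""
    return 1.0 - binary_entropy(D) if D < 0.5 else 0.0

print(rate_distortion(0.0))   # 1.0: lossless coding needs the full bit
print(rate_distortion(0.11))  # ~0.5: half a bit per symbol suffices
```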
Non-trivial regions are feasible
The CEO can be informed!
Does LSS have any advantage over SS?
Which performs better?
LSS outperforms when the measure is negative.
SS outperforms when the measure is positive.
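To make the SS-versus-LSS trade-off concrete, here is a toy Monte Carlo comparison under assumptions of my own (not the talk's analysis): observation noise is a bit flip with probability p, and compression at rate R is modeled as one extra independent flip with the rate-distortion value D, so that more sensors fit into the same bandwidth at the price of noisier messages.

```python
import random

def majority_error(n_sensors, flip_prob, trials=20000):
    """Empirical probability that a +/-1 majority vote over n_sensors
    observations (each flipped with probability flip_prob) misses a
    source fixed at +1. Deterministic via a fixed seed."""
    rng = random.Random(12345)
    errors = 0
    for _ in range(trials):
        votes = sum(1 if rng.random() > flip_prob else -1
                    for _ in range(n_sensors))
        if votes <= 0:
            errors += 1
    return errors / trials

# Fixed total bandwidth: 15 bits per source symbol (my own toy numbers).
# SS : 15 sensors, 1 bit each; only observation noise p = 0.2.
# LSS: 45 sensors at rate R = 1/3; compression adds distortion D = 0.11,
#      giving an effective flip probability p(1-D) + (1-p)D.
p, D = 0.2, 0.11
p_eff = p * (1 - D) + (1 - p) * D
ss_err = majority_error(15, p)
lss_err = majority_error(45, p_eff)
print(ss_err, lss_err)
```

Under these particular numbers the many-noisy-sensors strategy (LSS) achieves the lower estimation error, illustrating the comparative advantage claimed above.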
Existence of comparative advantage gives a strong motivation for making large systems.
Random Walk Statistics
Isolated Model Reduces to Random Walk Statistics.
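The reduction to random-walk statistics can be sketched with a standard central-limit argument (the notation below is mine, not the slides'): with n independent ±1 observations, each correct with probability 1 − p,

```latex
S_n = \sum_{i=1}^{n} x_i, \qquad
\mathbb{E}[S_n] = n(1-2p), \qquad
\mathrm{Var}[S_n] = 4np(1-p),
```

and the majority vote fails exactly when this random walk ends at or below zero, so by the CLT

```latex
P_{\mathrm{err}} = P(S_n \le 0) \;\approx\;
Q\!\left(\frac{\sqrt{n}\,(1-2p)}{2\sqrt{p(1-p)}}\right),
```

which decays rapidly as the number of sensors n grows.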
The large system strategy does not perform as well here,
where the fidelity criterion is:
Microscopic consistency may induce macroscopic order in the frustrated system.
Saddle Point of Free Energy
Frustrated model reduces to spin glass statistics.
for Replica Solution
Similar to the case of optimal random coding.
Coincides with optimal random coding.