Information & Communication, INST 4200, David J Stucki, Spring 2015
Information & Communication
INST 4200
David J Stucki
Spring 2015
Languages
• Natural Language (Human)
  • Roughly 7,000 known to exist
  • Fluency vs. Literacy
    • Spoken language is spontaneously acquired
    • Written language must be intentionally acquired
• Artificial Language (Computer)
  • More than 2,500 in existence
  • Types:
    • Programming: encoding algorithms & processes
    • Mark-up: encoding documents
    • Protocol: encoding communication mechanisms
  • Machine Language (native/hard-wired)
Binary
• 0 and 1 are the universal alphabet of all digital electronics:
  • Numbers (including sign (+/-), decimal point, etc.)
  • Text (including all natural language alphabets)
  • Images
  • Sounds
  • Video
  • Programs
  • Everything else!
• So what?
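The "everything is bits" claim can be made concrete with Python's built-in encodings (a sketch added for illustration; the specific formats shown are just Python's defaults):

```python
# Everything digital reduces to bit patterns. A few illustrations
# using Python's built-in encodings.

# An integer as an 8-bit binary pattern
n = 42
print(format(n, "08b"))        # 00101010

# Text: each character maps to a number (its Unicode code point),
# and that number is what gets stored as bits
for ch in "Hi":
    print(ch, format(ord(ch), "08b"))

# The raw bytes of a UTF-8 encoded string -- what actually
# sits in memory or on disk
print("Hi".encode("utf-8"))
```

Images, sounds, and video work the same way, just with more elaborate conventions for mapping pixels or audio samples to numbers.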
Claude Elwood Shannon (April 30, 1916 – February 24, 2001)
Shannon (1948), "A Mathematical Theory of Communication"
Information Theory
In 1948, Bell Labs scientist Claude Shannon developed Information Theory, and the world of communications technology has never been the same.
Information Theory
• Two issues:
  1. How do we represent analog data in a digital system?
     • Modeling & sampling techniques
     • Compression issues
  2. How do we reliably transmit digital data over an analog channel?
     • Error recovery
     • Bandwidth issues
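The second issue, error recovery, can be illustrated with the simplest possible scheme, a parity bit (a sketch I've added; the function names are mine, not from the slides):

```python
# Simplest error-recovery scheme: append a parity bit so the receiver
# can detect any single flipped bit introduced by a noisy channel.

def add_parity(bits):
    """Append an even-parity bit: the total count of 1s becomes even."""
    return bits + [sum(bits) % 2]

def check_parity(bits):
    """True if no single-bit error is detected (count of 1s is even)."""
    return sum(bits) % 2 == 0

msg = [1, 0, 1, 1]
sent = add_parity(msg)          # [1, 0, 1, 1, 1]
print(check_parity(sent))       # True: arrived intact

corrupted = sent.copy()
corrupted[2] ^= 1               # simulate one bit flipped on the channel
print(check_parity(corrupted))  # False: error detected
```

A parity bit detects single-bit errors but cannot locate or correct them; richer codes (e.g. Hamming codes) trade extra redundant bits for the ability to repair errors, which is the bandwidth trade-off the slide alludes to.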
Example of Lossy Compression
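Lossy compression can be shown in miniature by quantizing 8-bit samples down to 4 bits (an illustrative sketch I've added, not the example from the original slide):

```python
# Lossy compression in miniature: quantize 8-bit samples down to 4 bits.
# Storage is halved, but the discarded low-order bits are gone for good.

samples = [13, 130, 255, 77, 200]

# Compress: keep only the top 4 bits of each 8-bit sample
compressed = [s >> 4 for s in samples]

# Decompress: shift back -- the low 4 bits are lost,
# so the restored values only approximate the originals
restored = [c << 4 for c in compressed]

print(compressed)  # [0, 8, 15, 4, 12]
print(restored)    # [0, 128, 240, 64, 192]
```

This is the same principle behind JPEG images and MP3 audio: throw away the detail the consumer is least likely to notice, and accept that the original can never be perfectly reconstructed.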
Entropy
• Definition: lack of order or predictability (complexity)
• While not the familiar definition from thermodynamics, it is closely related, and can be transformed mathematically into an equivalent form
• The complexity of a string of symbols can be measured in terms of the length of the smallest program that will generate it
• The interesting consequence of this for both computer science and machine intelligence is that both highly ordered or predictable strings and completely random strings have low entropy, whereas high entropy lies in the middle, on the border of chaos
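Shannon's measure of entropy as unpredictability can be computed directly from symbol frequencies using his formula H = -Σ p·log₂(p) (a sketch I've added; the function name is mine):

```python
import math
from collections import Counter

def shannon_entropy(s):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(s)
    n = len(s)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 -- perfectly predictable
print(shannon_entropy("abababab"))  # 1.0 -- one bit per symbol
print(shannon_entropy("abcdefgh"))  # 3.0 -- eight equally likely symbols
```

Note that this frequency-based measure assigns *high* entropy to random-looking strings; the slide's point about complexity peaking "on the border of chaos" draws on the program-length (Kolmogorov-style) view of complexity mentioned above, which is a related but distinct notion.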