Damian Niwinski
Information theory
Lecture: Tuesday 10:15-12, room 3160.
Lecture Notes On-Line
ps, pdf (updated 17.05.2006).
Comments welcome!
Tutorials:
Tuesday 12:15-14, room 2100,
Michal Strojnowski.
Friday 8:30-10, room 1030,
Hugo Gimbert.
Tutorials web page
Exam
Problem sets for course credit (zadania zaliczeniowe): new series!
Objectives:
An introduction to a theory useful in many applications of computer science,
such as cryptography, modeling of natural language, and bioinformatics.
The theory defines quantitative measures of the information contained in
a random variable or in a sequence of bits.
It also provides criteria for optimal
compression (coding) of information and for sending a message through an unreliable (noisy) channel.
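As a small illustration of the quantitative measure mentioned above, here is a sketch of Shannon entropy in Python (the function name and examples are mine, not taken from the lecture notes):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum_i p_i * log2(p_i), measured in bits.

    Terms with p_i = 0 contribute nothing (the limit p*log p -> 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per toss:
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, hence carries less information:
print(entropy([0.9, 0.1]))   # about 0.469
```

The second value being below 1 reflects the intuition that a predictable source can, on average, be described with fewer bits.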
Plan:
1. From the 20-questions game to the concept of entropy. The Kraft inequality.
Huffman and Shannon-Fano codes.
2. Conditional entropy, mutual information.
3. Shannon's first theorem, on optimal encoding (source coding).
4. Channels, information loss, improving efficiency, channel capacity.
5. Shannon's theorem on sending information through a noisy channel.
6. Kolmogorov's information complexity. Chaitin's number.
7. Kolmogorov complexity vs. Shannon entropy: the universal test of Martin-Löf.
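To give a taste of point 1 of the plan, here is a minimal heap-based Huffman-coding sketch in Python (names and structure are mine; this is one standard way to implement the greedy construction, not the formulation from the lecture notes):

```python
import heapq

def huffman_code(freqs):
    """Build an optimal prefix code from a dict {symbol: weight}.

    Greedy construction: repeatedly merge the two lightest subtrees,
    prepending one distinguishing bit to every codeword in each.
    """
    # Heap entries: (weight, tie-breaker, {symbol: codeword-so-far}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(code)
```

For these dyadic probabilities the codeword lengths come out as 1, 2, 3, 3 bits, i.e. exactly -log2 of each probability, and the lengths satisfy the Kraft inequality with equality: 1/2 + 1/4 + 1/8 + 1/8 = 1.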
Literature (suggestions welcome!).
Previous edition:
Teoria informacji (Information Theory) 2004.