## Course description

How can data be compressed with the minimum number of bits? How can
wealth be invested in the stock market to achieve maximum growth? How
can a future signal be predicted from the past with the least
distortion? Over the past four decades, algorithms have been developed
for such tasks that achieve essentially optimal performance with no
prior knowledge of the statistical properties of the data.

This course studies the theoretical foundations of such universal
algorithms in both probabilistic (random data) and deterministic
(nonrandom data) settings, building on the notion of universal
probability. The main focus is on applications, including data
compression, prediction, portfolio selection, entropy estimation, and
classification. Other applications will also be explored through final
projects.
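As a small taste of the flavor of the course (not course material): the
Lempel–Ziv family of compressors is universal, meaning it approaches the
entropy rate of the source on long inputs without being told anything
about the source's statistics. Python's standard `zlib` module, whose
DEFLATE format is built on LZ77, can be used to see this in action:

```python
import zlib

# A highly redundant (low-entropy) source. The compressor discovers the
# repetition on its own, with no prior model of the pattern.
data = b"abracadabra" * 1000

compressed = zlib.compress(data)

# The LZ-based compressor exploits the redundancy it finds automatically.
print(len(data), len(compressed))
assert len(compressed) < len(data) // 10
```

The same call compresses English text, genomic data, or log files without
any per-source tuning, which is the sense in which such algorithms are
"universal."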

## Prerequisite

Working knowledge of probability.

## Course requirements and grading

**Class participation**: Each student is expected to take notes
and share them with the class.

**Homework**: There will be 4-5 homework assignments.

**Final project**: During the last week of the quarter, each group will
present an in-depth survey of a topic not covered in class. Projects can
be theoretical (more analysis) or practical (some implementation), and
will be carried out in groups of one or two students (two is strongly
recommended). Suggested topics can be found here.

**Grading**: Class participation 1/4, homework 1/4, and final
project 1/2. We reserve the right to change the weights later.

## Lectures

TuTh 12:30–1:50 pm

Warren Lecture Hall 2113

## Teaching staff