We show that relative entropy decreases over time for continuous-time Markov chains.
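As a numerical illustration of this fact, the following sketch (our own construction, not taken from the notes) evolves the distribution of a small three-state continuous-time Markov chain with an assumed generator `Q` via forward Euler steps of the Kolmogorov forward equation, and tracks the relative entropy against the stationary distribution:

```python
import numpy as np

# Hypothetical 3-state generator (rows sum to zero); any irreducible Q works.
Q = np.array([[-2.0, 1.0, 1.0],
              [1.0, -3.0, 2.0],
              [2.0, 2.0, -4.0]])

# Stationary distribution: the left null vector of Q, normalized to sum to 1.
w, V = np.linalg.eig(Q.T)
pi = np.real(V[:, np.argmin(np.abs(w))])
pi = pi / pi.sum()

def rel_entropy(p, q):
    """Discrete relative entropy D(p || q) = sum_i p_i log(p_i / q_i)."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = np.array([0.9, 0.05, 0.05])   # arbitrary initial distribution
dt = 1e-3
kl = [rel_entropy(p, pi)]
for _ in range(2000):
    p = p + dt * (p @ Q)          # Euler step of the forward equation p' = pQ
    kl.append(rel_entropy(p, pi))

print(kl[0], kl[-1])              # relative entropy has decayed toward 0
```

Numerically, the sequence `kl` is (up to discretization error) non-increasing, matching the monotonicity result stated above.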
We consider a system consisting of interacting objects. As we let the number of objects increase, we can characterize the limiting behaviour of the system.
In the Cross Entropy Method, we wish to estimate the likelihood of a rare event. Here the event depends on a random variable whose distribution is known and belongs to a parametrized family of densities. Further, the choice of sampling parameter is often cast as the solution to an optimization problem.
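A minimal worked sketch of this idea, under assumptions of our own choosing (the names `u`, `v`, `gamma`, `rho` and the exponential family are illustrative, not from the notes): estimate the rare-event probability P(X ≥ gamma) for X exponential with rate `u`, by tilting within the same family and updating the sampling rate `v` through the cross-entropy optimization.

```python
import numpy as np

rng = np.random.default_rng(0)

u, gamma, rho, n = 1.0, 10.0, 0.1, 10_000   # assumed problem parameters

def logpdf(x, rate):
    """Log density of the Exponential(rate) distribution."""
    return np.log(rate) - rate * x

v = u                                     # start from the nominal parameter
for _ in range(20):
    x = rng.exponential(1.0 / v, size=n)
    level = np.quantile(x, 1.0 - rho)     # adaptive elite level
    level = min(level, gamma)             # stop raising once gamma is reached
    elite = x >= level
    w = np.exp(logpdf(x, u) - logpdf(x, v))            # likelihood ratios
    v = np.sum(w[elite]) / np.sum(w[elite] * x[elite]) # analytic CE update
    if level >= gamma:
        break

# Final importance-sampling estimate under the tilted density Exp(v).
x = rng.exponential(1.0 / v, size=n)
w = np.exp(logpdf(x, u) - logpdf(x, v))
est = np.mean(w * (x >= gamma))
print(est, np.exp(-u * gamma))            # estimate vs the exact value
```

The closed-form update for `v` comes from minimizing the weighted cross entropy over the exponential family; for other families the inner optimization generally has no closed form.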
Sanov’s Theorem asks how likely it is that the empirical distribution of some IID random variables is far from the true distribution, and shows that the relative entropy determines the likelihood of being far.
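The large-deviations scaling behind this can be checked numerically. In the coin-flip example below (our own illustration), the probability that the empirical frequency of heads among n fair flips is at least a = 0.7 decays like exp(−n·D(a ∥ 1/2)), where D is the relative entropy between Bernoulli laws:

```python
import math

def bernoulli_kl(a, p):
    """Relative entropy D(Bernoulli(a) || Bernoulli(p))."""
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

def tail_prob(n, a, p=0.5):
    """Exact P(sum of n Bernoulli(p) >= a*n), by summing the binomial pmf."""
    k0 = math.ceil(a * n)
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k0, n + 1))

a = 0.7
for n in (50, 200, 800):
    rate = -math.log(tail_prob(n, a)) / n   # empirical decay rate per flip
    print(n, rate, bernoulli_kl(a, 0.5))
```

The empirical rate approaches D(0.7 ∥ 0.5) ≈ 0.0823 from above as n grows, consistent with the theorem.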
Entropy and Relative Entropy occur sufficiently often in these notes to justify a (somewhat) self-contained section. We cover the discrete case, which is the most intuitive.
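The two discrete-case definitions can be sketched in a few lines (helper names are ours), using the standard convention 0·log 0 = 0:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i, in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def rel_entropy(p, q):
    """Relative entropy D(p || q) = sum_i p_i log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]
print(entropy(uniform))    # log 4, the maximum over 4 outcomes
print(entropy(skewed))     # strictly smaller than log 4
print(rel_entropy(skewed, uniform), rel_entropy(uniform, uniform))
```

By Gibbs' inequality, D(p ∥ q) ≥ 0 with equality iff p = q, which the last line illustrates.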
- A heuristic look at the stochastic integral.
- A heuristic derivation of Itô’s formula.
- Markov’s Inequality; Chebyshev’s Inequality; Chernoff’s Bound.
- Bounds for the Poisson Distribution.
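The heuristics behind the first two items can be probed numerically. In the sketch below (our construction, with parameters chosen for illustration), left-endpoint Riemann sums approximate the Itô integral of W dW, and Itô’s formula for f(x) = x² predicts the limit (W_T² − T)/2, the extra −T/2 coming from the quadratic variation of Brownian motion:

```python
import numpy as np

rng = np.random.default_rng(1)

T, n, paths = 1.0, 4000, 2000
dW = rng.normal(0.0, np.sqrt(T / n), size=(paths, n))   # Brownian increments
W = np.cumsum(dW, axis=1)
W_left = np.hstack([np.zeros((paths, 1)), W[:, :-1]])   # W at left endpoints

ito_sum = np.sum(W_left * dW, axis=1)        # left-endpoint Riemann sums
ito_formula = (W[:, -1] ** 2 - T) / 2        # Ito's formula for f(x) = x^2
print(np.max(np.abs(ito_sum - ito_formula))) # small discretization error
```

Choosing the left endpoint matters: evaluating W at the right endpoint instead shifts the answer by the quadratic variation, which is exactly the discrepancy between the Itô and Stratonovich conventions.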
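The three inequalities and the Poisson bound can be compared on one example (our own, with an assumed mean `lam` and threshold `a`): for X ~ Poisson(lam), which has mean and variance lam, we bound P(X ≥ a) with a > lam. The Chernoff bound exp(−lam)·(e·lam/a)^a follows from optimizing E[exp(tX)]/exp(ta) over t > 0.

```python
import math

lam, a = 4.0, 12   # illustrative parameters

# Exact tail probability by summing the Poisson pmf up to a - 1.
exact = 1.0 - sum(math.exp(-lam) * lam**k / math.factorial(k)
                  for k in range(a))

markov = lam / a                          # uses only the mean
chebyshev = lam / (a - lam) ** 2          # uses mean and variance
chernoff = math.exp(-lam) * (math.e * lam / a) ** a  # optimized exponential

print(exact, markov, chebyshev, chernoff)
```

All three are valid upper bounds, and deep in the tail the Chernoff bound is dramatically tighter than the other two, since it alone decays at the correct exponential rate.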