# A Mean Field Limit

We consider a system consisting of $N$ interacting objects. As we let the number of objects increase, we can characterize the limiting behaviour of the system.

We define our model as follows:

• There are $N$ objects moving in discrete time, each taking states in the finite set ${\mathcal X}$.
• We let $X_n^N(t)$ be the state of the $n$-th object at time $t$.
• We let $M_x^N(t)$ give the proportion of objects in state $x\in{\mathcal X}$ at time $t$. Specifically,
$$M^N_x(t) = \frac{1}{N} \sum_{n=1}^N \mathbb{I}\big[ X^N_n(t) = x \big].$$
• $X^N_n$ performs transitions that depend on its previous state and the state of the system as recorded through a vector $S^N(t) \in {\mathbb R}^d$. Specifically, there is a transition matrix $P^N_{xy}(r)$, $r\in{\mathbb R}^d$, such that
$${\mathbb P}\big( X^N_n(t+1) = y \,\big|\, X^N_n(t) = x,\ S^N(t) = s \big) = P^N_{xy}(s).$$

• Further, the state of the system $S^N$ evolves according to the mean state vector $M^N$ and the previous state of the system, through a function $g$:
$$S^N(t+1) = g\big( M^N(t+1), S^N(t) \big).$$
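The dynamics above can be simulated directly. The sketch below instantiates the model for two states $\{0,1\}$; the particular choices of $P_{xy}(s)$ and $g$ are hypothetical illustrations, not taken from the text.

```python
import random

def P_row(x, s):
    """Row x of an illustrative transition matrix P_{xy}(s): the
    probabilities of jumping to states 0 and 1 given system state s."""
    p = min(max(0.2 + 0.5 * s, 0.0), 1.0)
    return [1.0 - p, p] if x == 0 else [0.3, 0.7]

def g(m, s):
    """Illustrative system-state update S(t+1) = g(M(t+1), S(t)):
    a damped average of the proportion of objects in state 1."""
    return 0.5 * s + 0.5 * m[1]

def simulate(N, T, seed=0):
    """Run the N-object system for T steps; return (M^N(T), S^N(T))."""
    rng = random.Random(seed)
    states = [rng.randrange(2) for _ in range(N)]
    s = 0.5
    m = [states.count(0) / N, states.count(1) / N]
    for _ in range(T):
        # Each object jumps independently given the current system state s.
        states = [0 if rng.random() < P_row(x, s)[0] else 1 for x in states]
        m = [states.count(0) / N, states.count(1) / N]
        s = g(m, s)
    return m, s
```

For large $N$ the empirical measure `m` fluctuates less and less between runs, which is the behaviour the limit theorem below makes precise.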

We now take a limit where the number of objects in the system, $N$, is sent to infinity. We make the following assumption:

• We assume that $P_{xy}^N(\cdot)$ converges uniformly to some probability transition matrix $P_{xy}(\cdot)$.

The following result shows that $M^N$ and $S^N$ admit a sensible limit.

Thrm [MF:Thrm] Provided
$$M^N(0) \rightarrow \mu(0) \quad\text{and}\quad S^N(0) \rightarrow \sigma(0)$$
for probability distribution $\mu (0)$ and vector $\sigma(0)$ then, almost surely,
$$M^N(t) \rightarrow \mu(t) \quad\text{and}\quad S^N(t) \rightarrow \sigma(t)$$
for each $t\in{\mathbb Z}_+$ where $\mu(t)$ and $\sigma(t)$ satisfy the equations
$$\mu(t+1) = \mu(t) P(\sigma(t)), \qquad \sigma(t+1) = g\big( \mu(t+1), \sigma(t) \big).$$
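The limiting pair $(\mu(t), \sigma(t))$ is deterministic and can be computed by iterating the recursion directly. The sketch below does so for the same illustrative two-state $P$ and update map $g$ as before (hypothetical choices, not from the text).

```python
def P_matrix(s):
    """Illustrative limit transition matrix P(s) on states {0, 1}."""
    p = min(max(0.2 + 0.5 * s, 0.0), 1.0)
    return [[1.0 - p, p], [0.3, 0.7]]

def g(m, s):
    """Illustrative update map sigma(t+1) = g(mu(t+1), sigma(t))."""
    return 0.5 * s + 0.5 * m[1]

def mean_field(mu0, sigma0, T):
    """Iterate mu(t+1) = mu(t) P(sigma(t)), sigma(t+1) = g(mu(t+1), sigma(t))."""
    mu, sigma = list(mu0), sigma0
    for _ in range(T):
        P = P_matrix(sigma)
        # Row vector times matrix: mu(t+1)_y = sum_x mu(t)_x P_{xy}(sigma(t)).
        mu = [sum(mu[x] * P[x][y] for x in range(2)) for y in range(2)]
        sigma = g(mu, sigma)
    return mu, sigma
```

Since each row of $P(s)$ sums to one, $\mu(t)$ remains a probability distribution at every step.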

For the proof we will require the following elementary lemma followed by a proposition.

Lemma [MF:Bern] Let $B^N$ be an integer in $\{0,1,...,N\}$ such that $\lim_{N\rightarrow\infty} B^N/N = \beta>0$. Let $Y^N_b$, $b=1,...,B^N$, be i.i.d. Bernoulli random variables with ${\mathbb E} Y^N_b = y$, and define
$$Z^N = \frac{1}{B^N} \sum_{b=1}^{B^N} Y^N_b,$$
with $Z^N=0$ when $B^N=0$. Then, almost surely,
$$\lim_{N\rightarrow\infty} Z^N = y.$$

Proof By a Chernoff bound, for each $\epsilon>0$ there exists $\alpha >0$ such that
$${\mathbb P}\big( | Z^N - y | > \epsilon \,\big|\, B^N \big) \leq 2 e^{-\alpha B^N}.$$
Thus, for any $b$ with $0 < b < \beta$,
$$\sum_{N=1}^{\infty} {\mathbb P}\big( | Z^N - y | > \epsilon,\ B^N \geq bN \big) \leq \sum_{N=1}^{\infty} 2 e^{-\alpha b N} < \infty.$$
By the Borel-Cantelli Lemma, almost surely only finitely many of the events $\{ | Z^N - y | > \epsilon,\ B^N \geq bN \}$ occur. But we know that $B^N \geq bN$ eventually for any $b<\beta$, so we can see that, almost surely, $| Z^N - y | \leq \epsilon$ eventually and hence $\limsup_{N\rightarrow \infty} | Z^N - y | \leq \epsilon$. Since $\epsilon$ is arbitrary, it must be that
$$\lim_{N\rightarrow\infty} Z^N = y,$$
as required. $\square$
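The concentration in the lemma is easy to observe numerically. The sketch below samples $Z^N$, the mean of $B^N = \lfloor \beta N \rfloor$ i.i.d. Bernoulli$(y)$ variables, and checks that it is close to $y$ for large $N$ (the function name `Z` is an illustrative helper, not from the text).

```python
import random

def Z(N, beta, y, seed=0):
    """Sample Z^N = (1/B^N) * sum of B^N i.i.d. Bernoulli(y) variables,
    with B^N = floor(beta * N), and Z^N = 0 when B^N = 0."""
    rng = random.Random(seed)
    B = int(beta * N)
    if B == 0:
        return 0.0
    # rng.random() < y is a Bernoulli(y) draw; booleans sum as 0/1.
    return sum(rng.random() < y for _ in range(B)) / B
```

For $N = 10^5$, $\beta = 0.5$, $y = 0.3$, the standard deviation of $Z^N$ is roughly $\sqrt{y(1-y)/B^N} \approx 0.002$, so deviations above $0.01$ are vanishingly rare.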

To prove the result we compare the chain $X^N_n(t)$ with $\tilde{X}^N_n(t)$, the chain where we replace $S^N(t)$ with its limit process $\sigma(t)$. That is,
$${\mathbb P}\big( \tilde{X}^N_n(t+1) = y \,\big|\, \tilde{X}^N_n(t) = x \big) = P^N_{xy}(\sigma(t)),$$

and $X^N_n(0)=\tilde{X}^N_n(0)$. Further, we note that since our state space is finite we can assume the elements of ${\mathcal X}$ are ordered (with ordering $x\leq y$), and we can achieve a jump in our chain from $X^N_n(t)$ to $X^N_n(t+1)$ with $S^N(t)=s$ (or from $\tilde{X}^N_n(t)$ to $\tilde{X}^N_n(t+1)$ with $\sigma(t)=s$) by taking a uniform $[0,1]$ random variable $U$ and setting
$$X^N_n(t+1) = y \qquad\text{whenever}\qquad \sum_{z < y} P^N_{xz}(s) \leq U < \sum_{z \leq y} P^N_{xz}(s),$$
where $x = X^N_n(t)$.
We will assume that the versions of $X^N$ and $\tilde{X}^N$ are constructed by this process and that we use the same uniform random variables for the jumps of $X^N_n(t)$ and $\tilde{X}^N_n(t)$ respectively.
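This inverse-CDF coupling can be sketched as follows: the same uniform $U$ drives both chains, one using the row $P_{x\cdot}(S^N(t))$ and the other $P_{x\cdot}(\sigma(t))$, so the two jumps disagree only when $U$ falls between the two cumulative sums. The rows below are illustrative stand-ins.

```python
def jump(U, row):
    """Inverse-CDF jump: return the smallest state y (in the assumed
    ordering 0 < 1 < ...) with U < sum_{z <= y} row[z]."""
    cum = 0.0
    for y, p in enumerate(row):
        cum += p
        if U < cum:
            return y
    return len(row) - 1  # guard against rounding in the final cumulative sum

# Hypothetical rows for the two chains jumping from the same state x:
row_true = [0.40, 0.60]   # stand-in for P_{x.}(S^N(t))
row_limit = [0.45, 0.55]  # stand-in for P_{x.}(sigma(t))
```

With these rows, any $U \in [0.40, 0.45)$ produces different jumps for the two chains, and every other $U$ produces the same jump; as $S^N(t) \rightarrow \sigma(t)$ this disagreement window shrinks.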

We now prove the convergence of $\tilde{X}$:

Prop [MF:Prop] Provided $M^N(0) \rightarrow \mu(0)$ for a probability distribution $\mu(0)$, then, almost surely, $\tilde{M}^N(t) \rightarrow \mu(t)$ for each $t\in{\mathbb Z}_+$, where $\mu(t)$ satisfies the equation $\mu(t+1) = \mu(t) P(\sigma(t)).$

Proof. We prove the result by induction on $t$, where we assume that the convergence result holds up until time $t$. The result trivially holds at time $0$.

Now, assume that the result holds until time $t$. We let $I_{xy}^n$ be the indicator of the event that the $n$-th object $\tilde{X}^N_n$ performs a jump from $x$ to $y$ at step $t$, so that
$$\tilde{M}^N_y(t+1) = \sum_{x\in{\mathcal X}} \frac{1}{N} \sum_{n=1}^N I^n_{xy}.$$

Focusing on each term,
$$\frac{1}{N} \sum_{n=1}^N I^n_{xy},$$
we see that it is a sum (divided by $N$) of $N\tilde{M}^N_x(t)$ Bernoulli random variables, each with mean $P^N_{xy}(\sigma(t))$, and so Lemma [MF:Bern] and the uniform convergence of $P^N$ give, almost surely,
$$\frac{1}{N} \sum_{n=1}^N I^n_{xy} \rightarrow \mu_x(t) P_{xy}(\sigma(t)),$$
provided $\lim_{N\rightarrow\infty}\tilde{M}^N_x(t) >0$. If $\lim_{N\rightarrow\infty}\tilde{M}^N_x(t) = 0$, the above term, which is bounded above by $\tilde{M}^N_x(t)$, must converge to zero in any case. Summing over $x$ then gives $\tilde{M}^N_y(t+1) \rightarrow \sum_{x} \mu_x(t) P_{xy}(\sigma(t)) = \mu_y(t+1)$, as required. $\square$

We now prove the main result. The main idea is to show, by induction, that $\tilde{X}^N$ and $X^N$ exhibit the same behaviour as $N$ goes to infinity.

Proof [Proof of Theorem [MF:Thrm]] One could think of this as an extension of Proposition [MF:Prop]. We perform induction on $t$ with the hypothesis that, almost surely,
$$\frac{1}{N} \#\big\{ n : X^N_n(t) \neq \tilde{X}^N_n(t) \big\} \rightarrow 0 \qquad\text{and}\qquad S^N(t) \rightarrow \sigma(t).$$

The hypothesis trivially holds for $t=0$. Assuming that the induction hypothesis holds until time $t$, we show that the result holds for time $t+1$. Note that
$$\frac{1}{N} \#\big\{ n : X^N_n(t+1) \neq \tilde{X}^N_n(t+1) \big\} \leq \sum_{x\in{\mathcal X}} A^N_x + \frac{1}{N} \#\big\{ n : X^N_n(t) \neq \tilde{X}^N_n(t) \big\},$$
where
$$A^N_x = \frac{1}{N} \#\big\{ n : X^N_n(t) = \tilde{X}^N_n(t) = x,\ X^N_n(t+1) \neq \tilde{X}^N_n(t+1) \big\}.$$
Since the final term above converges to zero by the induction hypothesis, it is sufficient to prove that each $A_x^N$ converges to zero. Note that the random variables $X^N_n(t+1)$ and $\tilde{X}^N_n(t+1)$ are governed by the same construction with $s$ equal to $S^N(t)$ and $\sigma(t)$ respectively. Letting $L_{xy}$ and $V_{xy}$ respectively equal the minimum and maximum of
$$\sum_{z \leq y} P^N_{xz}(S^N(t)) \qquad\text{and}\qquad \sum_{z \leq y} P^N_{xz}(\sigma(t)),$$
we see that $X^N_n(t+1)$ and $\tilde{X}^N_n(t+1)$ can differ only when
$$U_n \in \bigcup_{y\in{\mathcal X}} \big[ L_{xy}, V_{xy} \big),$$
where $U_n$ is the uniform random variable driving the $n$-th jump.

We let $G^N(z)$ be the empirical distribution of our uniform random variables, that is,
$$G^N(z) = \frac{1}{N} \sum_{n=1}^N \mathbb{I}\big[ U_n \leq z \big].$$
We note that
$$A^N_x \leq \sum_{y\in{\mathcal X}} \big( G^N(V_{xy}) - G^N(L_{xy}) \big) \leq \sum_{y\in{\mathcal X}} \big( V_{xy} - L_{xy} \big) + 2 |{\mathcal X}| \sup_z \big| G^N(z) - z \big|.$$
Each difference $V_{xy} - L_{xy}$ converges to zero, since $S^N(t) \rightarrow \sigma(t)$ by the induction hypothesis and $P^N_{xy}(\cdot)$ converges uniformly, while $\sup_z | G^N(z) - z |$ converges to zero by the Glivenko-Cantelli Theorem. Thus each $A^N_x$ converges to zero and the first part of the induction hypothesis holds at time $t+1$.

At this point we see that $M^N(t+1) \rightarrow \mu(t+1)$, since the proportion of objects on which $X^N$ and $\tilde{X}^N$ disagree at time $t+1$ vanishes and since $\tilde{M}^N(t+1) \rightarrow \mu(t+1)$ by Proposition [MF:Prop]. Further, by continuity of $g(\cdot,\cdot)$ and the induction hypothesis, we have that
$$S^N(t+1) = g\big( M^N(t+1), S^N(t) \big) \rightarrow g\big( \mu(t+1), \sigma(t) \big) = \sigma(t+1).$$
This proves the induction hypothesis at time $t+1$, and hence the theorem. $\square$
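The Glivenko-Cantelli step in the proof can also be checked numerically: the sup-distance between the empirical distribution of $N$ i.i.d. Uniform$[0,1]$ draws and the identity shrinks as $N$ grows. For sorted samples the supremum is attained at the sample points, which the sketch below exploits (the helper name is hypothetical).

```python
import random

def sup_distance(N, seed=0):
    """Compute sup_z |G^N(z) - z| for the empirical CDF G^N of N i.i.d.
    Uniform[0,1] draws. For sorted samples u_(1) <= ... <= u_(N), the
    supremum is max over i of ((i+1)/N - u_(i)) and (u_(i) - i/N)."""
    rng = random.Random(seed)
    u = sorted(rng.random() for _ in range(N))
    return max(max((i + 1) / N - ui, ui - i / N)
               for i, ui in enumerate(u))
```

Typical values scale like $1/\sqrt{N}$ (roughly $0.01$ at $N = 10^4$), consistent with the almost-sure convergence used in the proof.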