# Chapter 10 Further Topics


## 10.1 Moment Generating Function (MGF)

If we define the function $$g(X) = X^r$$ of a r.v. X, the expected value $$E[g(X)]$$ is called the rth moment about the origin.

$E[X^r] = \sum_x x^r f(x) \quad \text{(discrete case)}$

$E[X^r] = \int_x x^r f(x)\,dx \quad \text{(continuous case)}$

The first moment gives us the expectation $$E[X^1]$$. With the second moment $$E[X^2]$$ we can calculate the variance $$V(X) = E[X^2] - (E[X])^2$$.
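As a quick check of the moment formulas, here is a minimal sketch (the fair six-sided die is a hypothetical example, not from the text) that computes the first two moments of a discrete distribution and recovers the variance from them:

```python
from fractions import Fraction

# Hypothetical example: X is the outcome of a fair six-sided die,
# so f(x) = 1/6 for x in {1, ..., 6}.
f = {x: Fraction(1, 6) for x in range(1, 7)}

def moment(f, r):
    """rth moment about the origin: E[X^r] = sum_x x^r f(x)."""
    return sum(x**r * p for x, p in f.items())

mean = moment(f, 1)            # E[X]   = 7/2
second = moment(f, 2)          # E[X^2] = 91/6
variance = second - mean**2    # V(X)   = E[X^2] - (E[X])^2 = 35/12

print(mean, second, variance)
```

Using `Fraction` keeps the arithmetic exact, so the results match the hand-computed values with no rounding.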

The moment generating function (MGF) $$M_X(t)$$ is defined as follows.

$M_X(t) = E[e^{tX}] = \sum_x e^{tx} f(x) \quad \text{(discrete case)}$

$M_X(t) = E[e^{tX}] = \int_x e^{tx} f(x)\,dx \quad \text{(continuous case)}$

If the sum or integral above converges, then the MGF exists. If the MGF exists, then every moment can be obtained by differentiating it and evaluating at $$t = 0$$:

$\left.\dfrac{d^r M_X(t)}{dt^r}\right|_{t=0} = E[X^r]$

For instance, the MGF of the binomial distribution is $$M_X(t) = \sum_{x=0}^{n} e^{tx} \binom{n}{x}p^xq^{n-x} = (q + pe^t)^n$$, where the closed form follows from the binomial theorem applied to $$(pe^t)^x q^{n-x}$$.
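The binomial MGF can be sketched numerically. The parameters below ($$n = 10$$, $$p = 0.3$$) are arbitrary choices for illustration; the sketch checks that the sum matches the closed form $$(q + pe^t)^n$$ and that a numerical derivative of $$M_X(t)$$ at $$t = 0$$ recovers the first moment $$E[X] = np$$:

```python
import math

# Hypothetical parameters for a binomial r.v.: n trials, success prob p.
n, p = 10, 0.3
q = 1 - p

def mgf(t):
    """M_X(t) = sum over x of e^{tx} * C(n, x) * p^x * q^(n-x)."""
    return sum(math.exp(t * x) * math.comb(n, x) * p**x * q**(n - x)
               for x in range(n + 1))

# The sum collapses, via the binomial theorem, to (q + p e^t)^n.
t = 0.5
closed_form = (q + p * math.exp(t))**n
print(abs(mgf(t) - closed_form) < 1e-12)

# First moment via a central-difference derivative at t = 0: E[X] = np.
h = 1e-6
first_moment = (mgf(h) - mgf(-h)) / (2 * h)
print(abs(first_moment - n * p) < 1e-6)
```

A central difference is used instead of symbolic differentiation to keep the sketch dependency-free; for exact derivatives a symbolic package could differentiate the closed form directly.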

## 10.2 Covariance

We know about the variance of a single r.v. ($$V(X) = \sigma_X^2 = E[(X-E[X])^2]$$). But how do two random variables vary together? That is measured by the covariance of the joint distribution, $$\mathrm{Cov}(X,Y) = \sigma_{XY} = E[(X-E[X])(Y-E[Y])]$$, which simplifies to $$E[XY] - E[X]E[Y]$$.
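The shortcut formula $$E[XY] - E[X]E[Y]$$ can be sketched on a small discrete joint distribution. The joint pmf below is a made-up example for illustration:

```python
# Hypothetical joint pmf f(x, y) for two discrete r.v.s X and Y.
joint = {
    (0, 0): 0.2, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.6,
}

# Marginal expectations and the cross moment E[XY].
E_X  = sum(x * p for (x, y), p in joint.items())
E_Y  = sum(y * p for (x, y), p in joint.items())
E_XY = sum(x * y * p for (x, y), p in joint.items())

cov = E_XY - E_X * E_Y   # Cov(X, Y) = E[XY] - E[X]E[Y]
print(round(cov, 3))
```

Here $$E[X] = E[Y] = 0.7$$ and $$E[XY] = 0.6$$, so the covariance is $$0.6 - 0.49 = 0.11$$; the positive sign reflects that large X tends to occur with large Y in this pmf.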

## 10.3 Correlation

Simply put, the correlation coefficient measures the strength and direction of the (linear) relationship between random variables X and Y. It is obtained from the covariance and the standard deviations of the marginal distributions, $$\rho_{XY} = \dfrac{\sigma_{XY}}{\sigma_X\sigma_Y}$$, and always lies between $-1$ and $1$.
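Continuing with a small made-up joint pmf (the same illustrative numbers as in the covariance example above are one reasonable choice), the correlation coefficient divides the covariance by the marginal standard deviations:

```python
import math

# Hypothetical joint pmf (an assumption for illustration, not from the text).
joint = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.6}

E_X  = sum(x * p for (x, y), p in joint.items())
E_Y  = sum(y * p for (x, y), p in joint.items())
E_XY = sum(x * y * p for (x, y), p in joint.items())
E_X2 = sum(x * x * p for (x, y), p in joint.items())
E_Y2 = sum(y * y * p for (x, y), p in joint.items())

cov     = E_XY - E_X * E_Y
sigma_X = math.sqrt(E_X2 - E_X**2)   # std dev of the marginal of X
sigma_Y = math.sqrt(E_Y2 - E_Y**2)   # std dev of the marginal of Y

rho = cov / (sigma_X * sigma_Y)      # correlation coefficient
print(round(rho, 4))
```

With these numbers $$\sigma_X^2 = \sigma_Y^2 = 0.21$$, so $$\rho = 0.11/0.21 \approx 0.524$$, a moderate positive linear relationship.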

Correlation is frequently used to indicate the similarity between two processes. However, remember the popular saying that 'correlation does not imply causation': two processes may be strongly correlated without one causing the other, for example because both are driven by a third factor or because the correlation is coincidental. Ask your instructor (or Google) about 'spurious correlations'.