Let $x_1, x_2, \ldots, x_N$ be a set of $N$ independent random variates, where each $x_i$ has an arbitrary probability distribution $P(x_i)$ with mean $\mu_i$ and a finite variance $\sigma_i^2$.
Two variates $A$ and $B$ are statistically independent if the conditional probability $P(A\vert B)$ (the probability of an event $A$ given that $B$ has occurred) satisfies

$$P(A\vert B)=P(A) \tag{A.15}$$
in which case the probability of $A$ and $B$ is just

$$P(A B) = P(A \cap B)=P(A)\,P(B) \tag{A.16}$$
Similarly, $n$ events $A_1, A_2, \ldots, A_n$ are independent if

$$P\left(\bigcap_{i=1}^n A_i\right) = \prod^n_{i=1} P(A_i) \tag{A.17}$$
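As a quick numerical illustration of (A.16), the following Python sketch estimates the joint and marginal probabilities of two independent events; the dice-based events $A$ and $B$ are hypothetical choices made purely for this example.

```python
import random

random.seed(0)
trials = 200_000
# Hypothetical independent events on two fair dice:
# A = "first die is even" (P = 1/2), B = "second die > 4" (P = 1/3)
count_a = count_b = count_ab = 0
for _ in range(trials):
    d1, d2 = random.randint(1, 6), random.randint(1, 6)
    a, b = d1 % 2 == 0, d2 > 4
    count_a += a
    count_b += b
    count_ab += a and b

p_a, p_b, p_ab = count_a / trials, count_b / trials, count_ab / trials
# Independence (A.16): P(A and B) should match P(A) * P(B)
print(p_ab, p_a * p_b)
```

The two printed values agree to within the Monte Carlo error, which shrinks as the number of trials grows.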
Then the normal form variate

$$X_{norm}=\frac{\sum^N_{i=1} x_i - \sum^N_{i=1} \mu_i}{\sqrt{\sum^N_{i=1} \sigma_i^2}} \tag{A.18}$$

has a limiting cumulative distribution function which approaches a normal distribution.
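The limiting behaviour of (A.18) can be illustrated numerically. The Python sketch below builds $X_{norm}$ from uniform variates (an assumed parent distribution; any distribution with finite variance would serve) and checks that its sample mean and variance approach 0 and 1.

```python
import math
import random

random.seed(1)
N = 200          # summands per realization (assumed value for illustration)
samples = 5_000  # number of realizations of X_norm

# Parent distribution: uniform on [0, 1), so mu_i = 1/2 and sigma_i^2 = 1/12
mu_sum = N * 0.5
sigma_sum = math.sqrt(N / 12)

xnorm = [(sum(random.random() for _ in range(N)) - mu_sum) / sigma_sum
         for _ in range(samples)]          # Eq. (A.18)

mean = sum(xnorm) / samples
var = sum(v * v for v in xnorm) / samples - mean ** 2
print(mean, var)  # approach 0 and 1 as N and samples grow
```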
Under additional conditions on the distribution of the variates, the probability density itself is also normal, with mean $\mu=0$ and variance $\sigma^2=1$. If conversion to normal form is not performed, then the variate

$$X \equiv \frac{1}{N} \sum^N_{i=1} x_i \tag{A.19}$$

is normally distributed with $\mu_X=\mu_x$ and $\sigma_X=\sigma_x/\sqrt{N}$.
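A short Monte Carlo sketch in Python, with an exponential parent distribution chosen purely for illustration, shows the stated behaviour of (A.19): the sample mean clusters around $\mu_x$ with spread $\sigma_x/\sqrt{N}$.

```python
import math
import random

random.seed(2)
N = 100        # values averaged per variate X (illustrative choice)
reps = 10_000  # number of realizations of X

# Parent distribution: exponential with rate 1, so mu_x = sigma_x = 1
xs = [sum(random.expovariate(1.0) for _ in range(N)) / N
      for _ in range(reps)]                # Eq. (A.19)

mean_X = sum(xs) / reps
std_X = math.sqrt(sum((v - mean_X) ** 2 for v in xs) / reps)
print(mean_X, std_X)  # approximately mu_x = 1 and sigma_x / sqrt(N) = 0.1
```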
Consider the inverse FOURIER transform of $P_X(f)$,

$${\cal F}^{-1}\left[P_X(f)\right] = \int_{-\infty}^{\infty} e^{2 \pi \imath f X} P(X)\, dX = \sum_{n=0}^{\infty} \frac{(2 \pi \imath f)^n}{n!} \langle X^n \rangle. \tag{A.20}$$

Now write

$$\langle X^n \rangle = \langle N^{-n} ( x_1 + x_2 + \ldots + x_N )^n \rangle \tag{A.21}$$

$$= \int_{-\infty}^{\infty} N^{-n} (x_1 + \ldots + x_N )^n P(x_1) \cdots P(x_N)\, dx_1 \cdots dx_N, \tag{A.22}$$

so, summing over $n$ and collecting the contributions of the individual variates, we have

$${\cal F}^{-1}\left[P_X(f)\right] = \left[ 1 + \frac{2 \pi \imath f}{N} \langle x \rangle - \frac{(2 \pi f)^2}{2 N^2} \langle x^2 \rangle + O\left(N^{-3}\right) \right]^N. \tag{A.23}$$
Now expand the logarithm using

$$\ln{(1+x)} = x - \frac{1}{2} x^2 + \frac{1}{3} x^3 - \ldots, \tag{A.24}$$

so, keeping terms up to order $1/N$,

$${\cal F}^{-1}\left[P_X(f)\right] = e^{\textstyle 2 \pi \imath f \mu_x - (2 \pi f)^2 \frac{\sigma_x^2}{2N}}, \tag{A.25}$$

since

$$\mu_x \equiv \langle x \rangle \tag{A.26}$$

$$\sigma_x^2 \equiv \langle x^2 \rangle-\langle x \rangle^2 \tag{A.27}$$
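The series expansion (A.24) is easy to verify numerically; the helper below is a hypothetical partial-sum routine, compared against the library logarithm.

```python
import math

def log1p_series(x, terms=40):
    # Partial sum of ln(1+x) = x - x^2/2 + x^3/3 - ...   (Eq. A.24)
    return sum((-1) ** (n + 1) * x ** n / n for n in range(1, terms + 1))

# The series converges for |x| < 1; compare with the library logarithm
print(log1p_series(0.3), math.log(1.3))
```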
Taking the FOURIER transform,

$$P_X \equiv \int_{-\infty}^{\infty} e^{-2 \pi \imath f x}\, {\cal F}^{-1}\left[P_X(f)\right] df = \int_{-\infty}^{\infty} e^{2 \pi \imath f(\mu_x-x)-(2 \pi f )^2 \frac{\sigma_x^2}{2N}} df. \tag{A.28}$$
This is of the form

$$\int_{-\infty}^{\infty}e^{\imath a f - b f^2} df, \tag{A.29}$$

where $a \equiv 2 \pi (\mu_x - x)$ and $b \equiv 2 \pi^2 \sigma_x^2 / N$. This integral yields

$$\int_{-\infty}^{\infty} e^{\imath a f - b f^2} df = e^{-\frac{a^2}{4b}} \sqrt{\frac{\pi}{b}} \tag{A.30}$$
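The closed form (A.30) can be checked by direct numerical quadrature. The values of $a$ and $b$ below are arbitrary illustrative choices; a simple midpoint rule on a truncated range suffices because the integrand decays like a GAUSSIAN.

```python
import math

# Check (A.30): integral of exp(i*a*f - b*f^2) df over the real line
# equals exp(-a^2/(4b)) * sqrt(pi/b).  a and b are arbitrary test values.
a, b = 1.5, 2.0
L, steps = 20.0, 200_000       # truncate to [-L, L]; integrand ~ e^{-b f^2}
h = 2 * L / steps
re = im = 0.0
for k in range(steps):
    f = -L + (k + 0.5) * h     # midpoint rule
    w = math.exp(-b * f * f)
    re += w * math.cos(a * f) * h
    im += w * math.sin(a * f) * h

exact = math.exp(-a * a / (4 * b)) * math.sqrt(math.pi / b)
print(re, im, exact)  # real part matches, imaginary part vanishes
```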
Therefore

$$P_X = e^{-\frac{a^2}{4b}} \sqrt{\frac{\pi}{b}} = \sqrt{\frac{N}{2 \pi \sigma_x^2}}\; e^{\textstyle -\frac{N(\mu_x - x)^2}{2 \sigma_x^2}}. \tag{A.31}$$

But $\mu_X = \mu_x$ and $\sigma_X = \sigma_x/\sqrt{N}$, so

$$P_X=\frac{1}{\sigma_X \sqrt{2 \pi}} e^{\textstyle -\frac{(\mu_X - x)^2}{2 \sigma_X^2}} \tag{A.32}$$
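As a final sanity check of (A.32), the Python sketch below draws sample means of uniform variates (an assumed parent distribution) and verifies that roughly 68% of them fall within one $\sigma_X$ of $\mu_X$, as the normal density predicts.

```python
import math
import random

random.seed(3)
N, reps = 50, 20_000
mu_x, sigma_x = 0.5, math.sqrt(1 / 12)   # uniform(0,1) parent distribution
sigma_X = sigma_x / math.sqrt(N)         # spread predicted by (A.32)

xs = [sum(random.random() for _ in range(N)) / N for _ in range(reps)]
# For a normal density, the mass within one sigma is erf(1/sqrt(2)) ~ 0.683
frac = sum(abs(v - mu_x) < sigma_X for v in xs) / reps
print(frac)
```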
The "fuzzy" central limit theorem says that data which are influenced by many small and unrelated random effects are approximately normally distributed.
R. Minixhofer: Integrating Technology Simulation
into the Semiconductor Manufacturing Environment