The
normal distribution is the most commonly used probability distribution
in statistics. Many other probability distributions are related
to this distribution. As the number of independent random variables being summed increases, the distribution of their sum approaches a bell-shaped curve. This curve is called the normal curve or Gaussian curve, in honor of the German mathematician Carl Friedrich Gauss (1777-1855). The normal distribution is defined by two parameters: its mean and its standard deviation.
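For reference, the probability density of a normal distribution with mean \mu and standard deviation \sigma is

f(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)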
The following example shows the input and output from three simulations. Each has the same mean (50) but a different standard deviation: 5, 10, and 30, respectively. All three simulations use 50,000 iterations and an alpha of 5% (for a one-tailed test).
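A minimal sketch of these three simulations, written here in Python with NumPy rather than the simulation package that produced the original output (the seed and variable names are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(seed=1)   # fixed seed so the run is repeatable

MEAN = 50
STD_DEVS = (5, 10, 30)
ITERATIONS = 50_000

# One array of 50,000 normal draws per standard deviation, all with mean 50
simulations = {sd: rng.normal(loc=MEAN, scale=sd, size=ITERATIONS)
               for sd in STD_DEVS}
```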
The output shows the estimated skewness, mean, standard deviation, maximum value, minimum value, lower confidence limit, and upper confidence limit from each of the three simulations. The skewness of each simulated distribution is close to zero because the distributions are normal. Notice that the maximum and minimum values are approximately equal to the mean plus 4 standard deviations and the mean minus 4 standard deviations, respectively.
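These summary statistics can be reproduced from the simulated draws. The sketch below continues the previous one; the use of scipy.stats.skew and the choice of the 5th and 95th percentiles as the lower and upper limits (matching a one-tailed alpha of 5%) are assumptions for illustration, not necessarily how the original output defined them.

```python
from scipy.stats import skew  # sample skewness estimate

for sd, draws in simulations.items():
    lo, hi = np.percentile(draws, [5, 95])  # assumed lower/upper limits at one-tailed 5%
    print(f"SD={sd:>2}: skew={skew(draws):+.3f}  mean={draws.mean():.2f}  "
          f"sd={draws.std(ddof=1):.2f}  min={draws.min():.2f}  "
          f"max={draws.max():.2f}  lower={lo:.2f}  upper={hi:.2f}")
    # With 50,000 draws from a normal distribution, the skew comes out near 0
    # and the extremes fall roughly at mean - 4 SD and mean + 4 SD.
```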