If we have a continuous random variable, we can talk about its distribution. The probability density function describes how likely the random value is to fall within an interval: the area under the density curve over an interval is the probability that a value occurs in that interval, and the area under the entire curve must equal one. Two common distributions are shown below: the Gaussian density function and the uniform density function.
These graphs show the density functions of a Gaussian random variable and a uniform random variable.
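As a quick check of the area interpretation, we can numerically integrate the Gaussian density over the interval from -1 to 1 (a minimal sketch); the area comes out to about 0.68, the familiar one-standard-deviation probability:

x = -1 : 1/100 : 1;
g = exp(-x.^2/2) / sqrt(2*pi);     % standard Gaussian density on [-1, 1]
p = trapz(x, g)                    % area under the curve, about 0.68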
The distribution function of a random variable gives the probability that the random value is less than some number x. It is the integral of the density function from negative infinity to x, and the density function is in turn the slope of the distribution function. The distribution functions for Gaussian and uniform random variables are also shown below. Every distribution function ranges from zero to one and is nondecreasing.
These graphs show the distribution functions of a Gaussian random variable and a uniform random variable.
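We can also see the integral/slope relationship numerically. The following sketch builds the Gaussian distribution function as a running integral of the density and then recovers the density as its slope (the small offset comes from starting the integral at -2 rather than negative infinity):

x = -2 : 1/100 : 2;
g = exp(-x.^2/2) / sqrt(2*pi);     % Gaussian density
G = cumtrapz(x, g);                % running integral approximates the distribution function
gs = gradient(G, x);               % slope of the distribution recovers the density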
Random variables have an expected value, also known as the mean. The expected value is easy to estimate by averaging a number of samples. For symmetric distributions like the ones shown here, it is also the point at which the distribution function equals 0.5 (the median).
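For example, with a zero-mean Gaussian variable (a minimal sketch), the sample average and the 0.5 crossing of the distribution function both land near zero:

s = randn(10000, 1);               % samples of a zero-mean Gaussian variable
mean(s)                            % sample average, close to the expected value of 0
x = -2 : 1/100 : 2;
gp = 0.5*(1 + erf(x/sqrt(2)));     % Gaussian distribution function
[~, i] = min(abs(gp - 0.5));       % index where the distribution function crosses 0.5
x(i)                               % also close to 0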
Random variables also have a standard deviation, which describes how far from the mean the values typically fall. The square of the standard deviation is called the variance, and it can be computed as the mean of the squares minus the square of the mean:
var(x) = mean(x^2) - (mean(x))^2
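For example, with Gaussian samples of mean 3 and standard deviation 2 (a minimal sketch), the formula gives a value close to the true variance of 4:

s = 3 + 2*randn(10000, 1);         % Gaussian samples with mean 3, standard deviation 2
mean(s.^2) - mean(s)^2             % mean of squares minus squared mean, about 4
var(s, 1)                          % MATLAB's population variance, for comparison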
When we have more than one random variable we can talk about joint probability functions. The joint distribution function D(x1, x2) gives the probability that X1 < x1 and X2 < x2. The volume under the joint probability density function over some region of x1 and x2 values gives the probability that both random values fall in that region. We can get a good idea of what a joint probability density looks like by plotting two sets of measured values against each other.
These graphs show the joint distribution patterns of two Gaussian random variables and two uniform random variables, respectively.
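As a simple numerical illustration (a sketch assuming two independent Gaussian variables), the joint probability D(0, 0) that both values are negative should be about 0.25, and counting samples agrees:

rg = randn(10000, 2);              % samples of two independent Gaussian variables
mean(rg(:,1) < 0 & rg(:,2) < 0)    % fraction with X1 < 0 and X2 < 0, about 0.25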
The graphs in this appendix were created using the following MATLAB commands:
x = -2 : 1/100 : 2;                % 401 grid points from -2 to 2
u(1:100) = 0;                      % uniform density: zero below -1
u(101:301) = 0.5;                  % 0.5 on [-1, 1], so the total area is one
u(302:401) = 0;                    % zero above +1
g = exp(-x.^2/2) / sqrt(2*pi);     % Gaussian density
up(1:100) = 0;                     % uniform distribution function: zero below -1
up(101:301) = (0:200)/200;         % ramps linearly from 0 to 1 on [-1, 1]
up(302:401) = 1;                   % one above +1
gp = 0.5*(1 + erf(x/sqrt(2)));     % Gaussian distribution function
plot(x,g)                          % Gaussian density
plot(x,u)                          % uniform density
plot(x,gp)                         % Gaussian distribution function
plot(x,up)                         % uniform distribution function
rg = randn(1000,2);                % samples of two independent Gaussian variables
ru = rand(1000,2);                 % samples of two independent uniform variables
plot(rg(:,1), rg(:,2), 'o');       % joint Gaussian scatter plot
plot(ru(:,1), ru(:,2), 'o');       % joint uniform scatter plot