Normal distributions

The distribution of a sum of independent normally distributed random variables also follows a normal distribution. This is a rather particular result for normally distributed variables; see here for a detailed proof of this result. In particular, consider two independent random variables $X$ and $Y$, where $X \sim \mathcal{N}(\mu_X, \sigma_X^2)$ and $Y \sim \mathcal{N}(\mu_Y, \sigma_Y^2)$ (i.e., means $\mu_X$ and $\mu_Y$ and standard deviations $\sigma_X$ and $\sigma_Y$). Then, their sum $Z = X + Y$ is also normally distributed, with $Z \sim \mathcal{N}(\mu_X + \mu_Y, \sigma_X^2 + \sigma_Y^2)$. The most straightforward proof of this is the geometric one (see link above).
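As a quick numerical check of this result, the sketch below (with hypothetical parameter values chosen only for illustration) samples two independent normals and verifies that their sum has mean $\mu_X + \mu_Y$ and variance $\sigma_X^2 + \sigma_Y^2$:

```python
import math
import random

# Hypothetical parameters, chosen only for this illustration
mu_x, sigma_x = 1.0, 2.0
mu_y, sigma_y = -0.5, 1.5

random.seed(0)
n = 200_000
# Draw Z = X + Y with X ~ N(mu_x, sigma_x^2), Y ~ N(mu_y, sigma_y^2)
z = [random.gauss(mu_x, sigma_x) + random.gauss(mu_y, sigma_y) for _ in range(n)]

mean_z = sum(z) / n
var_z = sum((v - mean_z) ** 2 for v in z) / n

print(mean_z)            # should be close to mu_x + mu_y = 0.5
print(var_z)             # should be close to sigma_x**2 + sigma_y**2 = 6.25
print(math.sqrt(var_z))  # should be close to sqrt(6.25) = 2.5
```

This only confirms the mean and variance, of course; the claim that $Z$ is itself normal (not merely a distribution with those moments) is what the proof linked above establishes.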

We now move from the probability density to the cumulative distribution function. Briefly (see here for more detailed information), the probability density of the normal distribution is

$$f(x \mid \mu, \sigma^2) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left[-\frac{(x - \mu)^2}{2\sigma^2}\right]$$

This probability density tells you that if you want to know the probability that a random sample $X \sim \mathcal{N}(\mu, \sigma^2)$ is between the values $x_0 < x_1$, it is given by

$$P(x_0 \le X \le x_1) = \int_{x_0}^{x_1} f(t \mid \mu, \sigma^2)\, dt$$

If we take the special case where $x_0 \to -\infty$, then we have the cumulative distribution function,

$$P(X \le x) = \int_{-\infty}^{x} f(t \mid \mu, \sigma^2)\, dt = \frac{1}{2}\left[1 + \operatorname{erf}\left(\frac{x - \mu}{\sigma\sqrt{2}}\right)\right]$$

where $\operatorname{erf}$ is the error function. The cumulative distribution function goes to 0 as $x \to -\infty$ and to 1 as $x \to \infty$.
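The error-function form of the CDF is easy to evaluate directly, since Python's standard library provides `math.erf`. A minimal sketch (the function names here are my own, not from any particular library):

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    # P(X <= x) = (1/2) * [1 + erf((x - mu) / (sigma * sqrt(2)))]
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def normal_prob_between(x0, x1, mu=0.0, sigma=1.0):
    # P(x0 <= X <= x1), written as a difference of two CDF values
    return normal_cdf(x1, mu, sigma) - normal_cdf(x0, mu, sigma)

print(normal_cdf(0.0))                 # 0.5, by symmetry about the mean
print(normal_prob_between(-1.0, 1.0))  # about 0.6827: the "one sigma" probability
```

Note how the interval probability $P(x_0 \le X \le x_1)$ falls out as a difference of two CDF evaluations, avoiding any explicit numerical integration of the density.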