Design of Statistical Investigations. 2 Background Stats. Stephen Senn.
2 Background Stats
1) If Xi is a random variable with expected value E[Xi] = μi and variance V[Xi] = σi², and a and b are two constants, then E[a + bXi] = a + bμi and V[a + bXi] = b²σi².
2) If Xi and Xj are two random variables, then E[aXi + bXj] = aμi + bμj and V[aXi + bXj] = a²σi² + b²σj² + 2abσij, where σij = E[(Xi − μi)(Xj − μj)] is known as the covariance of Xi and Xj.
3) If X1, X2, ..., Xn are n independent random variables, with expectations μ1, ..., μn and variances σ1², ..., σn², respectively, then Σ aiXi has expectation Σ aiμi and variance Σ ai²σi².
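Rule (1) can be checked by simulation. A minimal sketch, in which the constants, the Normal distribution and the simulation size are all illustrative choices, not part of the notes:

```python
# Sketch: Monte Carlo check of E[a + bX] = a + b*mu and V[a + bX] = b^2*sigma^2.
# The values of a, b, mu and sigma below are invented for illustration.
import random

random.seed(1)
a, b = 2.0, 3.0
mu_i, sigma_i = 5.0, 1.5      # assumed E[Xi] and sd of Xi

n_sims = 200_000
draws = [a + b * random.gauss(mu_i, sigma_i) for _ in range(n_sims)]

mean_hat = sum(draws) / n_sims
var_hat = sum((d - mean_hat) ** 2 for d in draws) / (n_sims - 1)
# mean_hat should be close to a + b*mu_i = 17.0
# var_hat should be close to b**2 * sigma_i**2 = 20.25
```

The same device extends to rules (2) and (3) by drawing two or more variables per replication.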
If X1, X2, ..., Xn is a random sample of size n from a population with variance σ², then
Σ (Xi − X̄)²
is known as the corrected sum of squares and has expected value (n − 1)σ².
NB The factor (n - 1), known as the degrees of freedom, arises because the correction point (in this case the sample mean) is estimated from the data. In general we lose one degree of freedom for every constant fitted.
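The (n − 1) factor can be seen numerically. A sketch in which the population parameters, sample size and number of replications are arbitrary illustrative choices:

```python
# Sketch: E[CSS] = (n - 1)*sigma^2, checked by averaging the corrected sum
# of squares over many simulated samples (all values invented for illustration).
import random

random.seed(2)
n, mu, sigma = 5, 10.0, 2.0
n_reps = 100_000

def css(sample):
    """Corrected sum of squares about the sample mean."""
    xbar = sum(sample) / len(sample)
    return sum((x - xbar) ** 2 for x in sample)

mean_css = sum(css([random.gauss(mu, sigma) for _ in range(n)])
               for _ in range(n_reps)) / n_reps
# mean_css should be close to (n - 1) * sigma**2 = 16, not n * sigma**2 = 20
```

Correcting about the estimated mean rather than the true mean is exactly what costs the one degree of freedom.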
If a corrected sum of squares, CSS, with ν degrees of freedom is calculated from a random sample from a Normal distribution with variance σ², then CSS/σ² has a chi-square distribution with ν degrees of freedom.
If Y1 has a chi-square distribution with ν1 degrees of freedom and Y2 is independently distributed as a chi-square with ν2 degrees of freedom, then Y = Y1 + Y2 has a chi-square distribution with ν1 + ν2 degrees of freedom.
If Z is a random variable which is Normally distributed with mean 0 and variance 1 and Y is independently distributed as a chi-square with ν degrees of freedom, then
t = Z/√(Y/ν)
has a t distribution with ν degrees of freedom.
The square of a t with ν degrees of freedom is distributed F(1, ν).
The square of a Normal(0, 1) is distributed χ² with 1 degree of freedom.
The sum of a series of Normally distributed random variables is itself Normally distributed with mean and variance given by the rule for linear combinations.
The ratio of two independent chi-square random variables, each divided by its degrees of freedom, is an F r.v. with corresponding degrees of freedom. (If the numerator chi-square has ν1 d.f. and the denominator has ν2 d.f., then the resulting r.v. is F(ν1, ν2).)
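The construction of a t variable from a Normal and an independent chi-square can be sketched directly, building the chi-square as a sum of ν squared standard Normals (ν and the simulation size here are arbitrary choices):

```python
# Sketch: t = Z / sqrt(Y/nu), with Y a chi-square on nu d.f. formed as a
# sum of nu squared independent standard Normals (nu chosen for illustration).
import math
import random

random.seed(3)
nu = 10
n_sims = 100_000

def t_draw():
    z = random.gauss(0.0, 1.0)
    y = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(nu))  # chi-square, nu d.f.
    return z / math.sqrt(y / nu)

ts = [t_draw() for _ in range(n_sims)]
mean_t = sum(ts) / n_sims
var_t = sum(t * t for t in ts) / n_sims
# A t on nu d.f. has mean 0 and variance nu/(nu - 2) = 1.25 here;
# t**2 behaves like an F(1, nu) variable, consistent with the rules above.
```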
A model of the form
Yi = β0 + β1X1i + β2X2i + ... + βkXki + εi,   i = 1, ..., n,
where Yi is a response measured on the ith individual (for example patient i), X1i, X2i, etc. are measurements of linear predictors (covariates) for the ith individual and εi is a stochastic disturbance term, is known as a general linear model and may be expressed in matrix form as
Y = Xβ + ε
Y is an n x 1 vector of responses
X is an n x (k + 1) matrix of predictors consisting of k + 1 column vectors of length n, where the first column vector has all n elements = 1 and the next k columns represent the linear predictors X1 to Xk.
ε is an n x 1 vector of disturbance terms with E(ε) = 0, usually assumed independent and of constant variance σ², so that
E(εεᵀ) = σ²I, where I is an n x n identity matrix.
The Ordinary Least Squares (OLS) estimator of β is
b = (XᵀX)⁻¹XᵀY = AXᵀY, where A = (XᵀX)⁻¹,
and its variance (variance-covariance matrix) is
V(b) = σ²(XᵀX)⁻¹ = σ²A.   (1.2)
If the further assumption is made that the εi terms are Normally distributed, then b has a multivariate Normal distribution and individual elements of b are Normally distributed with variance identifiable from (1.2).
In practice, σ² will be unknown but has the unbiased estimate
s² = eᵀe/(n − k − 1),
where e = Y − Xb is the vector of residuals from the fitted model.
The ratio of bj to s√ajj has a t-distribution with n − k − 1 degrees of freedom, where bj is the jth element of b and ajj is the jth diagonal element of A = (XᵀX)⁻¹.
This fact may be used to test hypotheses about any element of β and to construct confidence intervals for it.
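For a single predictor plus intercept the matrix formulas reduce to familiar closed forms, which makes a small numerical sketch possible. The data values below are invented purely for illustration:

```python
# Sketch: OLS with one predictor plus an intercept via the closed-form
# normal equations; data are made up for illustration only.
import math

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n, k = len(x), 1

xbar = sum(x) / n
ybar = sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))

b1 = sxy / sxx            # slope estimate
b0 = ybar - b1 * xbar     # intercept estimate

# Residuals e = Y - Xb and the unbiased estimate s^2 = e'e/(n - k - 1)
e = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
s2 = sum(ei ** 2 for ei in e) / (n - k - 1)

# For the slope, a_jj (the diagonal element of (X'X)^{-1}) is 1/Sxx,
# so the t-ratio b1 / (s * sqrt(a_jj)) has n - k - 1 = 3 d.f.
t_slope = b1 / math.sqrt(s2 / sxx)
```

The t-ratio computed this way can be compared with a t table on n − k − 1 degrees of freedom, exactly as the notes describe.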
The bivariate Normal first received extensive application in statistical analysis in the work of Francis Galton (1822-1911) who was a UCL man! These are some brief notes about some mathematical aspects of it.
If the joint probability density function of two random variables X and Y is given by
f(x, y) = [1/(2πσXσY√(1 − ρ²))] exp{ −[1/(2(1 − ρ²))] [ ((x − μX)/σX)² − 2ρ((x − μX)/σX)((y − μY)/σY) + ((y − μY)/σY)² ] }   (1.1)
then X and Y are said to have a bivariate Normal distribution.
Since (1.1) is a p.d.f., it integrates to 1 over the whole (x, y) plane.
A contour plot of a bivariate Normal, for particular values of μX, μY, σX, σY and ρ, is given on the next slide.
If we integrate out Y, we obtain the marginal distribution of X and this is, in fact, a Normal with mean μX and variance σX²:
X ~ N(μX, σX²),   (1.4)
and similarly, by integrating out X, we obtain the marginal distribution of Y, which is also a Normal distribution:
Y ~ N(μY, σY²).   (1.5)
From (1.4) and (1.5) we see that μX, μY and σX², σY² are respectively the means of X and Y and the variances of X and Y. The parameter ρ is known as the correlation coefficient and was studied extensively by Galton.
We are often interested in the conditional distribution of Y given X and vice versa. These also turn out to be Normal distributions. In fact we have
Y | X = x ~ N( μY + ρ(σY/σX)(x − μX), σY²(1 − ρ²) )
and, of course, an analogous expression exists for the conditional distribution of X given Y, exchanging Y for X and vice versa.
The conditional mean of Y is thus a straight line in x, with intercept μY − ρ(σY/σX)μX and slope ρ(σY/σX). Given a bivariate Normal, for particular values of X we will find that the average value of Y lies on this straight line. The degree of scatter about this line is constant and is given by the conditional variance σY²(1 − ρ²).
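The regression-line property can be checked by simulation. In the sketch below the parameter values are illustrative, and the correlated pair is generated by the standard device of mixing two independent standard Normals (an assumption of the sketch, not something stated in the notes):

```python
# Sketch: for a simulated bivariate Normal, the least-squares slope of Y on X
# should approach rho*sigma_Y/sigma_X. All parameter values are invented.
import math
import random

random.seed(5)
mu_x, mu_y = 1.0, 2.0
sigma_x, sigma_y, rho = 2.0, 3.0, 0.6
n = 200_000

xs, ys = [], []
for _ in range(n):
    z1 = random.gauss(0.0, 1.0)
    z2 = random.gauss(0.0, 1.0)
    # Standard construction: X from z1, Y correlated with X through z1.
    xs.append(mu_x + sigma_x * z1)
    ys.append(mu_y + sigma_y * (rho * z1 + math.sqrt(1 - rho * rho) * z2))

xbar = sum(xs) / n
ybar = sum(ys) / n
sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
sxx = sum((x - xbar) ** 2 for x in xs)
slope = sxy / sxx
# slope should be close to rho * sigma_y / sigma_x = 0.9
```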
Clarke and Kempson, Introduction to the Design and Analysis of Experiments, Arnold, London, 1997, Chapter 2.
Senn, S.J. Cross-over Trials in Clinical Research, (2nd edition), Wiley, Chichester, 2002, Chapter 2.