
Variance and covariance






Presentation Transcript


  1. General additive models. Variance and covariance. Sums of squares. The matrix M contains the (column) means.

  2. The coefficient of correlation. We deal with samples. For a matrix X that contains several variables, the sample covariance matrix is \Sigma = \frac{1}{n-1}(X - M)'(X - M). The diagonal matrix S_X contains the standard deviations as entries. X - M is called the central matrix. The matrix R = S_X^{-1} \Sigma S_X^{-1} is a symmetric matrix that contains all correlations between the variables.
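A minimal numerical sketch of these definitions, using NumPy (the data values are made up for illustration):

    import numpy as np

    X = np.array([[2.0, 1.0], [4.0, 3.0], [6.0, 2.0], [8.0, 6.0]])  # n cases x m variables
    n = X.shape[0]
    M = np.tile(X.mean(axis=0), (n, 1))           # M: every row holds the column means
    C = X - M                                     # the central matrix
    Sigma = C.T @ C / (n - 1)                     # sample covariance matrix
    S_inv = np.diag(1 / np.sqrt(np.diag(Sigma)))  # inverse of the diagonal matrix S_X
    R = S_inv @ Sigma @ S_inv                     # correlation matrix: symmetric, ones on the diagonal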

  3. Pre- and postmultiplication. Premultiplying by a diagonal matrix scales the rows; postmultiplying by a diagonal matrix scales the columns. For diagonal matrices, pre- and postmultiplication give the same result: diagonal matrices commute.
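The slide's formulas did not survive extraction; the standard identities they presumably showed are:

    (DA)_{ij} = d_i a_{ij}            premultiplication: row i is scaled by d_i
    (AD)_{ij} = a_{ij} d_j            postmultiplication: column j is scaled by d_j
    DE = ED,   (DE)_{ii} = d_i e_i    two diagonal matrices commute

This is why the correlation matrix above can be written as R = S_X^{-1} \Sigma S_X^{-1}: each entry \sigma_{ij} is divided by the two standard deviations s_i and s_j.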

  4. Linear regression. European bat species and environmental correlates.

  5. Matrix approach to linear regression (N = 62). X is not a square matrix, hence X^{-1} doesn't exist.
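The formula on this slide was an image and is gone; presumably it was the standard normal-equations derivation:

    Y = Xb  \;\Rightarrow\;  X'Y = X'Xb  \;\Rightarrow\;  b = (X'X)^{-1}X'Y

X'X is square (m x m) and invertible as long as the predictor variables are linearly independent, so b can be estimated even though X itself has no inverse.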

  6. The species–area relationship of European bats. What about the part of variance explained by our model? 1.16: average number of species per unit area (species density). 0.24: spatial species turnover.
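Assuming the model is the usual power-law species–area relationship (this reading of the two numbers is an inference, not stated in the transcript):

    S = cA^z,  \qquad  \ln S = \ln c + z \ln A

1.16 would then be the intercept term c (species density) and 0.24 the slope z (spatial turnover).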

  7. How to interpret the coefficient of determination. Total variance, residual (unexplained) variance, and explained variance. Statistical testing is done by an F- or a t-test.
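The standard definitions behind these terms (reconstructed; the slide's own formulas were lost):

    R^2 = \frac{SS_{explained}}{SS_{total}} = 1 - \frac{SS_{residual}}{SS_{total}}

    F = \frac{R^2 / k}{(1 - R^2)/(n - k - 1)}

with k and n - k - 1 degrees of freedom, where n is the number of cases and k the number of predictors.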

  8. The general linear model. A model that assumes that a dependent variable Y can be expressed by a linear combination of predictor variables X is called a linear model. The vector E contains the error terms of each regression. The aim is to minimize E.
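In matrix form (standard notation, consistent with slide 5):

    Y = Xb + E,  \qquad  \text{minimize }  E'E = (Y - Xb)'(Y - Xb)

The least-squares minimizer is the b = (X'X)^{-1}X'Y given above.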

  9. The general linear model. If the errors of the predictor variables are Gaussian, the error term e should also be Gaussian, and means and variances are additive: total variance = explained variance + unexplained (rest) variance.

  10. Multiple regression. Model formulation. Estimation of model parameters. Estimation of statistical significance.
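The three steps as a compact sketch in Python/NumPy (the data and names are fabricated for illustration; the F-test follows slide 7):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n, k = 62, 2                                   # cases, predictors
    X = rng.normal(size=(n, k))
    y = 1.0 + 0.5 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(scale=0.2, size=n)

    # 1. Model formulation: y = Xd b + e, with an intercept column
    Xd = np.column_stack([np.ones(n), X])

    # 2. Parameter estimation via the normal equations b = (X'X)^{-1} X'y
    b = np.linalg.solve(Xd.T @ Xd, Xd.T @ y)

    # 3. Statistical significance: F-test on the coefficient of determination
    ss_res = np.sum((y - Xd @ b) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot
    F = (r2 / k) / ((1 - r2) / (n - k - 1))
    p = stats.f.sf(F, k, n - k - 1)                # upper-tail probability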

  11. Multiple R and R².

  12. The coefficient of determination. [Diagram: predictor variables x1, x2, …, xm and the dependent variable y.] The correlation matrix can be divided into four compartments.
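The four compartments and the resulting formula (a standard result, presumably what the slide displayed):

    R = \begin{pmatrix} R_{XX} & R_{XY} \\ R_{XY}' & 1 \end{pmatrix},
    \qquad R^2 = R_{XY}' R_{XX}^{-1} R_{XY}

R_{XX} holds the correlations among the predictors, R_{XY} the correlations of each predictor with y.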

  13. Adjusted R². R: correlation matrix; n: number of cases; k: number of independent variables in the model. A term with D < 0 is statistically not significant and should be eliminated from the model.
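The adjusted R² formula itself was an image and is gone; the standard form, using the n and k defined above, is:

    R^2_{adj} = 1 - (1 - R^2)\frac{n - 1}{n - k - 1}

Unlike R², this can decrease (and even turn negative) when weak predictors are added.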

  14. A mixed model

  15. The final model. Negative species density. Realistic increase of species richness with area. Increase of species richness with winter length. Increase of species richness at higher latitudes. A peak of species richness at intermediate latitudes. Is this model realistic? The model makes a series of unrealistic predictions. Our initial assumptions are wrong despite the high degree of variance explanation. Our problem arises in part from the intercorrelation between the predictor variables (multicollinearity). We solve the problem by a step-wise approach, eliminating the variables that are either not significant or give unreasonable parameter values. The variance explanation of this final model is higher than that of the previous one.

  16. Multiple regression solves systems of intrinsically linear algebraic equations. Polynomial regression (see the sketch below). General additive model. • The matrix X'X must not be singular. That is, the variables have to be independent; otherwise we speak of multicollinearity. Collinearity of r < 0.7 is in most cases tolerable. • To be safely applied, multiple regression needs at least 10 times as many cases as variables in the model. • Statistical inference assumes that errors have a normal distribution around the mean. • The model assumes linear (or algebraic) dependencies. Check first for non-linearities. • Check the distribution of the residuals Y_exp - Y_obs. This distribution should be random. • Check whether the parameters have realistic values. Multiple regression is a hypothesis-testing and not a hypothesis-generating technique!
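Polynomial regression illustrates the "intrinsically linear" point: the model is non-linear in x but linear in the parameters, so the ordinary least-squares machinery applies unchanged (a minimal sketch; the data are made up):

    import numpy as np

    rng = np.random.default_rng(1)
    x = np.linspace(0.0, 3.0, 30)
    y = 1.0 + 2.0 * x - 0.5 * x**2 + rng.normal(scale=0.1, size=x.size)

    # Design matrix [1, x, x^2]: each column is a fixed function of x,
    # so the system is linear in the coefficients b
    X = np.column_stack([np.ones_like(x), x, x**2])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)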

  17. Standardized coefficients of correlation. Z-transformed distributions have a mean of 0 and a standard deviation of 1. In the case of bivariate regression Y = aX + b, R_XX = 1, hence B = R_XY. Hence the use of Z-transformed values results in standardized correlation coefficients, termed beta-values.
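In the multivariate case the normal equations, applied to Z-transformed data, reduce to correlation form (a standard result, consistent with the bivariate special case above):

    \beta = R_{XX}^{-1} R_{XY}

With a single predictor R_{XX} = 1 and \beta = R_{XY}, as stated on the slide.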

  18. How to interpret beta-values. • If the predictor variables are uncorrelated, then the beta-values equal the simple coefficients of correlation. • Beta-values are generalisations of simple coefficients of correlation. However, there is an important difference: the higher the correlation between two or more predictor variables (multicollinearity), the less r depends on the correlation between X and Y. Hence other variables might have more and more influence on r and b. At high levels of multicollinearity it might therefore become more and more difficult to interpret beta-values in terms of correlations. Because beta-values are standardized b-values, they should allow comparisons to be made about the relative influence of predictor variables. High levels of multicollinearity might lead to misinterpretations. Beta-values above one are always a sign of too high multicollinearity. • Hence high levels of multicollinearity might: reduce the exactness of beta-weight estimates; change the probabilities of making type I and type II errors; make it more difficult to interpret beta-values. • We might apply an additional parameter, the so-called coefficient of structure, defined as c_i = r_{iY}^2 / R^2, where r_{iY} denotes the simple correlation between predictor variable i and the dependent variable Y, and R^2 is the coefficient of determination of the multiple regression. • Coefficients of structure therefore measure the fraction of the explained variability that a given predictor variable accounts for. Again, the interpretation of c_i is not always unequivocal at high levels of multicollinearity.
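A short numeric sketch of beta-values and structure coefficients under multicollinearity (Python/NumPy; the data are fabricated so that x1 and x2 are strongly correlated):

    import numpy as np

    rng = np.random.default_rng(2)
    n = 100
    x1 = rng.normal(size=n)
    x2 = 0.8 * x1 + 0.2 * rng.normal(size=n)       # collinear with x1
    y = x1 + 0.5 * x2 + rng.normal(scale=0.5, size=n)

    z = lambda v: (v - v.mean()) / v.std(ddof=1)   # Z-transformation
    Zx, Zy = np.column_stack([z(x1), z(x2)]), z(y)

    Rxx = Zx.T @ Zx / (n - 1)          # correlations among predictors
    Rxy = Zx.T @ Zy / (n - 1)          # correlations with y
    beta = np.linalg.solve(Rxx, Rxy)   # beta = Rxx^{-1} Rxy
    R2 = Rxy @ beta                    # R^2 for z-scored data
    c = Rxy**2 / R2                    # coefficients of structure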

  19. Partial correlations. The partial correlation r_xy.z is the correlation of the residuals D_X and D_Y. Semipartial correlation: a semipartial correlation correlates a variable with one residual only.
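In terms of simple correlations (standard formulas; the slide's own were lost):

    r_{xy \cdot z} = \frac{r_{xy} - r_{xz} r_{yz}}{\sqrt{(1 - r_{xz}^2)(1 - r_{yz}^2)}}

    r_{y(x \cdot z)} = \frac{r_{xy} - r_{yz} r_{xz}}{\sqrt{1 - r_{xz}^2}}

In the semipartial case z is removed from x only, so y is correlated with a single residual.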

  20. Path analysis and linear structure models. Multiple regression: the error term e contains the part of the variance in Y that is not explained by the model; these errors are called residuals. Regression analysis does not study the relationships between the predictor variables. Path analysis defines a whole model and tries to separate correlations into direct and indirect effects. Path analysis tries to do something that is logically impossible: to derive causal relationships from sets of observations.

  21. Path analysis is largely based on the computation of partial coefficients of correlation. Path coefficients. Path analysis is a model-confirmatory tool. It should not be used to generate models or even to seek models that fit the data set. We start from regression functions.

  22. From Z-transformed values we get (averaging over cases): \overline{e Z_Y} = 0, \overline{Z_Y Z_Y} = 1, and \overline{Z_X Z_Y} = r_{XY}. Path analysis is a nice tool to generate hypotheses. It fails at low coefficients of correlation and circular model structures.
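Applying these identities to a model with two standardized predictors shows how a correlation splits into direct and indirect effects (a standard derivation, consistent with the identities above):

    Z_Y = p_1 Z_1 + p_2 Z_2 + e

Multiplying by Z_1 and averaging over cases gives

    r_{1Y} = p_1 + p_2 r_{12}

so the correlation r_{1Y} decomposes into a direct effect (the path coefficient p_1) and an indirect effect via the correlated predictor (p_2 r_{12}).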

  23. Non-metric multiple regression.

  24. Statistical inference. Rounding errors due to different precisions cause the residual variance to be larger than the total variance.

  25. Logistic and other regression techniques. We use odds. The logistic regression model.
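The model in formulas (standard logistic regression; the slide's equations were images):

    odds = \frac{p}{1 - p}, \qquad
    \ln\frac{p}{1 - p} = b_0 + b_1 x_1 + \dots + b_k x_k, \qquad
    p = \frac{1}{1 + e^{-(b_0 + b_1 x_1 + \dots + b_k x_k)}}

The linear predictor acts on the log-odds (logit) scale, which keeps p between 0 and 1.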

  26. Generalized non-linear regression models. A special regression model that is used in pharmacology. b0 is the maximum response at dose saturation. b1 is the concentration that produces a half-maximum response. b2 determines the slope of the function, that is, it measures how fast the response increases with increasing drug dose.
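The parameter description matches the Hill (sigmoidal E_max) dose-response model, which is presumably the lost formula:

    y = \frac{b_0 x^{b_2}}{b_1^{b_2} + x^{b_2}}

As x \to \infty, y \to b_0 (saturation); at x = b_1 the response is b_0 / 2; and b_2 sets the steepness of the curve.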
