
Computing Bivariate Correlation Using SPSS

Computing Bivariate Correlation Using SPSS: Pearson's correlation coefficient, Spearman's rho, and Kendall's tau-b, using the Bivariate Correlations procedure.


Presentation Transcript


  1. Computing Bivariate Correlation Using SPSS: Pearson's correlation coefficient, Spearman's rho, and Kendall's tau-b

  2. The Bivariate Correlations • This procedure computes the pair-wise associations for a set of variables and displays the results in a matrix. It is useful for determining the strength and direction of the association between two scale or ordinal variables.

  3. Pairwise: When computing a measure of association between two variables in a larger set, cases are included in the computation when the two variables have non-missing values, irrespective of the values of the other variables in the set. • Scale: A variable can be treated as scale when its values represent ordered categories with a meaningful metric, so that distance comparisons between values are appropriate. • Ordinal: A variable can be treated as ordinal when its values represent categories with some intrinsic ranking; for example, levels of service satisfaction from highly dissatisfied to highly satisfied.

  4. Correlation • Correlation coefficients measure how variables or rank orders are related. • Before calculating a correlation coefficient, screen your data for outliers (which can cause misleading results) and for evidence of a linear relationship.

  5. Pearson's correlation coefficient is a measure of linear association. Two variables can be perfectly related, but if the relationship is not linear, Pearson's correlation coefficient is not an appropriate statistic for measuring their association. • Pearson correlation coefficients assume the data are normally distributed.
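
The statistic described here can also be illustrated outside SPSS. Below is a minimal sketch using Python's scipy.stats as an analogue of what the Bivariate Correlations procedure reports; the data values are hypothetical, not the data set used in these slides.

    from scipy.stats import pearsonr

    # Hypothetical paired observations (not the data from the slides)
    x = [2, 4, 5, 7, 9, 11, 13]
    y = [1, 3, 4, 8, 9, 12, 14]

    r, p = pearsonr(x, y)   # r: strength/direction of linear association; p: two-tailed p-value
    print(f"Pearson r = {r:.3f}, p = {p:.4f}")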

  6. To Obtain Bivariate Correlations • From the menus choose: Analyze > Correlate > Bivariate... • Select two or more numeric variables.

  7. The following options are also available: • Correlation Coefficients. • For quantitative, normally distributed variables, choose the Pearson correlation coefficient. • If your data are not normally distributed or have ordered categories, choose Kendall's tau-b or Spearman, which measure the association between rank orders.
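
For comparison, the three coefficients offered in this dialog can be computed on the same pair of variables. This is only an illustrative sketch with scipy.stats on hypothetical data, not the SPSS dialog itself.

    from scipy.stats import pearsonr, spearmanr, kendalltau

    # Hypothetical scores standing in for two SPSS variables
    x = [2, 4, 5, 7, 9, 11, 13]
    y = [1, 3, 4, 8, 9, 12, 14]

    for name, test in [("Pearson", pearsonr), ("Spearman", spearmanr), ("Kendall tau-b", kendalltau)]:
        stat, p = test(x, y)
        print(f"{name:>13}: coefficient = {stat:.3f}, p = {p:.4f}")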

  8. The following options are also available: • Test of Significance. You can select two-tailed or one-tailed probabilities. If the direction of association is known in advance, select One-tailed. Otherwise, select Two-tailed.
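
To illustrate the one-tailed versus two-tailed choice: recent SciPy versions (1.9 or newer) accept an alternative argument in pearsonr. This is a sketch on hypothetical data, not the SPSS option itself.

    from scipy.stats import pearsonr

    x = [2, 4, 5, 7, 9, 11, 13]
    y = [1, 3, 4, 8, 9, 12, 14]

    # Two-tailed: no direction of association assumed in advance
    r, p_two = pearsonr(x, y, alternative='two-sided')
    # One-tailed: a positive association was expected beforehand
    r, p_one = pearsonr(x, y, alternative='greater')
    print(f"two-tailed p = {p_two:.4f}, one-tailed p = {p_one:.4f}")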

  9. The following options are also available: • Flag significant correlations. Correlation coefficients significant at the 0.05 level are identified with a single asterisk, and those significant at the 0.01 level are identified with two asterisks.
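
The flagging rule is simply a threshold on the p-value. A small sketch of the same convention (one asterisk for p < 0.05, two for p < 0.01), again on hypothetical data:

    from scipy.stats import pearsonr

    x = [2, 4, 5, 7, 9, 11, 13]
    y = [1, 3, 4, 8, 9, 12, 14]

    r, p = pearsonr(x, y)
    flag = "**" if p < 0.01 else "*" if p < 0.05 else ""   # same convention as the SPSS flags
    print(f"r = {r:.3f}{flag}  (p = {p:.4f})")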

  10. Result: Pearson’s correlation

  11. Result: Spearman’s rho correlation

  12. The correlations table displays Pearson correlation coefficients, significance values, and the number of cases with non-missing values.

  13. Pearson correlation coefficients • The Pearson correlation coefficient is a measure of linear association between two variables. • The values of the correlation coefficient range from -1 to +1. • The sign of the correlation coefficient indicates the direction of the relationship (positive or negative).

  14. Pearson correlation coefficients • The absolute value of the correlation coefficient indicates the strength, with larger absolute values indicating stronger relationships. • The correlation coefficients on the main diagonal are always 1.0, because each variable has a perfect positive linear relationship with itself. • Correlations above the main diagonal are a mirror image of those below.
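
These matrix properties (1.0 on the diagonal, mirror image above and below) can be seen in any correlation matrix. A minimal sketch with pandas, using hypothetical columns named aritmatika and loneliness:

    import pandas as pd

    # Hypothetical columns; the slides use the SPSS variables Aritmatika and Loneliness
    df = pd.DataFrame({
        "aritmatika": [2, 4, 5, 7, 9, 11, 13],
        "loneliness": [1, 3, 4, 8, 9, 12, 14],
    })

    print(df.corr(method="pearson"))   # diagonal entries are 1.0; the matrix is symmetric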

  15. In this example, the correlation coefficient for Aritmatika and Loneliness is 0.837. Since 0.837 is relatively close to 1, this indicates that Aritmatika and Loneliness are strongly and positively correlated.

  16. Significance Values • The significance level (or p-value) is the probability of obtaining a result at least as extreme as the one observed, assuming there is no true association. • If the significance level is very small (less than 0.05), the correlation is statistically significant and the two variables are linearly related. • If the significance level is relatively large (for example, 0.50), the correlation is not significant and the two variables are not linearly related.

  17. The significance level (p-value) is reported as 0.000, meaning it is smaller than 0.001, so the result is highly significant. • This small significance level indicates that Aritmatika and Loneliness are significantly positively correlated.

  19. Significance Values • Because the correlation is positive, as Aritmatika increases, Loneliness also increases, and as Aritmatika decreases, Loneliness also decreases.

  20. N is the number of cases with non-missing values. • In this table, the number of cases with non-missing values for both Aritmatika and Loneliness is 23.
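
Pairwise N is simply the count of cases where both variables are non-missing. A small pandas sketch with hypothetical data containing missing values:

    import pandas as pd

    # Hypothetical data with missing values to show pairwise counting
    df = pd.DataFrame({
        "aritmatika": [2, 4, None, 7, 9],
        "loneliness": [1, 3, 4, None, 9],
    })

    # N = cases with non-missing values on BOTH variables (pairwise deletion)
    n = (df["aritmatika"].notna() & df["loneliness"].notna()).sum()
    print("Pairwise N =", n)   # 3 in this toy example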

  21. Notice • Even if the Pearson correlation between two variables is not significant, the variables may still be associated; the relationship may simply not be linear. • In that case, use Spearman's rho or Kendall's tau-b, which measure association between rank orders (see the sketch below).
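
A quick illustration of that point: for a relationship that is perfectly monotonic but strongly nonlinear, the rank-based coefficients stay at 1 while Pearson's r drops below 1 (hypothetical data, scipy.stats):

    from scipy.stats import pearsonr, spearmanr, kendalltau

    x = [1, 2, 3, 4, 5, 6, 7]
    y = [xi ** 3 for xi in x]   # perfectly monotonic, but far from linear

    print("Pearson r    :", round(pearsonr(x, y)[0], 3))   # noticeably below 1
    print("Spearman rho :", round(spearmanr(x, y)[0], 3))  # exactly 1.0
    print("Kendall tau-b:", round(kendalltau(x, y)[0], 3)) # exactly 1.0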
