Kappa {vcd}                                                R Documentation

Cohen's Kappa and weighted Kappa

Description

Computes two agreement statistics between two raters, Cohen's kappa and weighted kappa, together with their confidence intervals.

Usage

Kappa(x, weights = c("Equal-Spacing", "Fleiss-Cohen"), conf.level = 0.95)

Arguments

x a confusion matrix, i.e., a square table cross-classifying the ratings of two raters.
weights either one of the two character options above, or a user-specified weight matrix with the same dimensions as x.
conf.level confidence level for the confidence intervals.

Details

Cohen's kappa is the diagonal sum of the (possibly weighted) relative frequencies, corrected for expected values and standardized by its maximum value. The equal-spacing weights are defined by 1 - abs(i - j) / (r - 1), where r is the number of rows/columns, and the Fleiss-Cohen weights by 1 - abs(i - j)^2 / (r - 1)^2. The latter attach greater importance to near disagreements.
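As an illustration of these formulas, the statistic can be sketched in a few lines of R. This is a minimal sketch of the computation described above, not vcd's implementation; the function name kappa_stat is hypothetical, and no standard errors are computed.

```r
## Minimal sketch of the kappa formulas above; kappa_stat is a
## hypothetical helper, not part of vcd.
kappa_stat <- function(x, weights = c("Equal-Spacing", "Fleiss-Cohen")) {
  weights <- match.arg(weights)
  p <- x / sum(x)                    # relative frequencies
  r <- nrow(p)
  d <- abs(row(p) - col(p))          # |i - j| for each cell
  w <- if (weights == "Equal-Spacing")
    1 - d / (r - 1)
  else
    1 - d^2 / (r - 1)^2
  po <- sum(w * p)                                 # observed (weighted) agreement
  pe <- sum(w * outer(rowSums(p), colSums(p)))     # agreement expected by chance
  (po - pe) / (1 - pe)
}
```

Replacing w by the identity matrix recovers the unweighted Cohen's kappa; for a 2 x 2 table both weighting schemes reduce to the identity, so weighted and unweighted kappa coincide there.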

Value

An object of class "Kappa" with three components:

Kappa Kappa statistic, along with its Approximate Standard Error (ASE) and the confidence bounds corresponding to conf.level.
Kappa.Weighted the same quantities for the weighted kappa.
Weights weight matrix used.

Author(s)

David Meyer
david.meyer@ci.tuwien.ac.at

References

  • Cohen, Jacob (1960): A coefficient of agreement for nominal scales. Educational and Psychological Measurement.
  • Everitt, B.S. (1968): Moments of statistics kappa and weighted kappa. The British Journal of Mathematical and Statistical Psychology.

See Also

agreementplot

Examples

data(SexualFun)
Kappa(SexualFun)
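The weighting options can be exercised in the same way. A sketch, assuming SexualFun is the square ratings table shipped with vcd:

```r
## Fleiss-Cohen weights instead of the default equal spacing
Kappa(SexualFun, weights = "Fleiss-Cohen")

## user-specified weight matrix (here the identity, so the weighted
## statistic coincides with the unweighted kappa)
Kappa(SexualFun, weights = diag(nrow(SexualFun)))
```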
    