Intercoder reliability
'''In short:''' Intercoder reliability is a fundamental aspect of good Content Analysis: different researchers code the same material and their results are cross-referenced. This helps identify discrepancies in how each coder interprets the material and highlights the importance of a clearly defined and transparently communicated codebook. By cross-referencing the coding results, the research team can pause the analysis and revise the codebook, the coding instructions, and the coding process. Conversely, neglecting this intercoder evaluation can reinforce any of the biases mentioned in this article. When only one or a few coding results are produced, especially for ambiguous data sources, the synthesized results can be strongly influenced by these few perspectives.
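To give a sense of how cross-referenced coding results can be evaluated quantitatively, the following is a minimal sketch of two common agreement measures, percent agreement and Cohen's kappa, computed for two coders who applied the same codebook. The coder labels and the codes ("risk", "benefit", "neutral") are hypothetical illustrations, not material from this entry; dedicated content-analysis software usually reports such measures directly.

<syntaxhighlight lang="python">
# Minimal sketch: quantifying intercoder reliability for two coders.
# The coder labels below are hypothetical example data, not from this entry.
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Share of coding units on which both coders assigned the same code."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(coder_a)
    p_observed = percent_agreement(coder_a, coder_b)
    freq_a = Counter(coder_a)
    freq_b = Counter(coder_b)
    # Chance agreement: probability that both coders pick the same code at random,
    # given how often each coder used each code overall.
    p_chance = sum((freq_a[c] / n) * (freq_b[c] / n)
                   for c in set(coder_a) | set(coder_b))
    return (p_observed - p_chance) / (1 - p_chance)

# Two coders applied the same codebook to ten text segments (hypothetical codes).
coder_1 = ["risk", "risk", "benefit", "neutral", "risk",
           "benefit", "neutral", "risk", "benefit", "risk"]
coder_2 = ["risk", "benefit", "benefit", "neutral", "risk",
           "benefit", "risk", "risk", "benefit", "risk"]

print(f"Percent agreement: {percent_agreement(coder_1, coder_2):.2f}")
print(f"Cohen's kappa:     {cohens_kappa(coder_1, coder_2):.2f}")
</syntaxhighlight>

Cohen's kappa corrects raw agreement for the agreement that would be expected by chance, which is why it is often preferred over simple percent agreement when codes are unevenly distributed across the material.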
== Introduction ==
− | |||
<!-- The following creates a table with 3 rows and 4 columns (phase, step, description, output). | <!-- The following creates a table with 3 rows and 4 columns (phase, step, description, output). |
The table below displays an ideal-typical process of establishing intercoder reliability.

{| class="wikitable"
! Phase !! Step !! Description !! Output
|-
| Example || Example || Example || Example
|-
| Example || Example || Example || Example
|-
| Example || Example || Example || Example
|-
| Example || Example || Example || Example
|}
== What the method does ==

== Strengths and challenges ==

== Normativity ==

== Outlook ==

== Key publications and References ==
References are listed according to Chicago Style conventions.
The author of this entry is Max Mustermann.