Published online Jan 18, 2019. doi: 10.5312/wjo.v10.i1.14
Peer-review started: September 12, 2018
First decision: October 15, 2018
Revised: October 27, 2018
Accepted: December 24, 2018
Article in press: December 24, 2018
To investigate the inter- and intra-rater reliability of the vertebral fracture classifications used in the Swedish Fracture Register.
Radiological images of consecutive patients with cervical spine fractures (n = 50) were classified by five raters with different levels of experience on two occasions. An identical process was performed for thoracolumbar fractures (n = 50). Cohen’s kappa was used to calculate inter- and intra-rater reliability.
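For readers unfamiliar with the statistic, Cohen’s kappa corrects the observed agreement between two raters for the agreement expected by chance. A minimal illustrative sketch of the computation is given below; it is not the study’s analysis code, and the labels used are hypothetical:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters classifying the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e the chance agreement implied by each rater's
    marginal label frequencies.
    """
    assert len(rater1) == len(rater2)
    n = len(rater1)
    # Observed agreement: fraction of items classified identically.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement from each rater's marginal frequencies.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[label] * c2[label] for label in c1) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters, four fractures, labels "A"/"B".
print(cohens_kappa(["A", "A", "B", "B"], ["A", "A", "B", "A"]))  # 0.5
```

By convention (e.g., Landis and Koch), kappa values of 0.41–0.60 are often described as moderate agreement and 0.61–0.80 as substantial, which is the scale against which the results below can be read.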
The mean kappa coefficient for inter-rater reliability ranged between 0.54 and 0.79 for the cervical fracture classifications, between 0.51 and 0.72 for the thoracolumbar classifications (overall and for the different subclassifications), and between 0.65 and 0.77 for the presence or absence of signs of an ankylosing disorder in the fracture area. The mean kappa coefficient for intra-rater reliability ranged between 0.58 and 0.80 for the cervical fracture classifications, between 0.46 and 0.68 for the thoracolumbar fracture classifications (overall and for the different subclassifications), and between 0.79 and 0.81 for the presence or absence of signs of an ankylosing disorder in the fracture area.
The classifications used in the Swedish Fracture Register for vertebral fractures have acceptable inter- and intra-rater reliability, with a moderate strength of agreement.
Core tip: The Swedish Fracture Register gathers national data on fractures. We adapted commonly used classifications for spine fractures and studied their inter- and intra-rater reliability as a basis for future use of the register, including research on outcomes after spine fractures.