Kappa statistics for the two operational definitions of agreement (intra-observer and inter-observer) were calculated in SAS:

  proc freq data=kappa_test;
    tables observer1*observer2 / agree;
  run;

Bland JM, Altman DG. Measuring agreement in method comparison studies. Stat Methods Med Res. 1999;8(2):135-60. Barnhart HX, Haber MJ, Lin LI. An overview on assessing agreement with continuous measurements. J Biopharm Stat. 2007;17(4):529-69.

The objective of this study was to estimate inter-observer and intra-observer agreement in the radiographic diagnosis of AVN after SCFE. The results showed that, according to the Landis and Koch criteria [4], intra-observer agreement was "almost perfect" for both observers and inter-observer agreement was "substantial".
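The procedure above assumes that a dataset of paired classifications already exists. A minimal, self-contained sketch of such an analysis is given below; it reuses the dataset name from the code above, but the cell counts are purely illustrative and are not the study data.

  /* Hypothetical cross-classification of the two observers' first
     readings (AVN vs. no AVN); counts are illustrative only. */
  data kappa_test;
    input observer1 $ observer2 $ count;
    datalines;
  AVN   AVN    5
  AVN   noAVN  3
  noAVN AVN    1
  noAVN noAVN 94
  ;
  run;

  /* The WEIGHT statement tells PROC FREQ that each row is a cell
     count; the AGREE option requests the kappa coefficient. */
  proc freq data=kappa_test;
    weight count;
    tables observer1*observer2 / agree;
  run;

With the cell counts supplied this way, the output includes the simple kappa coefficient together with its standard error and confidence limits.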

The importance of these data relates to the interpretation of the differences in the reported risk of AVN after unstable SCFE: 47% as reported by Loder et al., compared with the 15% risk reported in other studies [2, 3]. Based on the results of this study, it seems unlikely that this discrepancy arose from disagreement between observers on the radiographic diagnosis of AVN. It was beyond the scope of this study to explore other possible explanations for the difference, e.g. different patient populations, measurement errors, different levels of observer experience, or different times at which the radiographs were obtained. During the study period, 103 cases of SCFE were treated in our centre. Of these, four were diagnosed with AVN of the femoral head (Fig. 1) and the remaining 99 were not (Fig. 2). Each case of AVN showed the typical findings of femoral head fragmentation and collapse.

The first observer identified eight cases of AVN in the first reading and seven in the second; the kappa value for intra-observer agreement was 0.9. The second observer recorded six cases of AVN in the first reading and five in the second; the kappa value for intra-observer agreement was 0.88. Inter-observer agreement was assessed on the first reading, and the kappa value was 0.79. Repeatability versus reproducibility: repeatability describes the closeness of agreement between measurements made under the same conditions, i.e. in the same laboratory, by the same observer, and with the same equipment (PET scanner, image reconstruction software), over short time intervals.
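For context, the kappa values quoted above measure the observed agreement corrected for the agreement expected by chance:

  \kappa = \frac{p_o - p_e}{1 - p_e}

where p_o is the observed proportion of agreement and p_e the proportion expected by chance from the marginal totals. As a purely illustrative calculation (the proportions are assumed, not taken from the study's tables), p_o = 0.97 and p_e = 0.86 give \kappa = 0.11/0.14 ≈ 0.79, which falls in the "substantial" range (0.61-0.80) of Landis and Koch.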
