Analysis of the reliability of the evaluation and scoring process for abstract submissions for the “Journées Francophones de la Kinésithérapie 2023”

Authors

  • Jean-Philippe Deneuville — PRISMATIC Lab, Poitiers University Hospital; Institut Pprime UPR 3346, CNRS - Université de Poitiers. https://orcid.org/0009-0004-2379-1383
  • Matthieu Gallou-Guyot — Center for Interdisciplinary AI and Data Science, Ochanomizu University, Tokyo, Japan; Laboratoire HAVAE, Université de Limoges, Limoges, France. https://orcid.org/0000-0002-2616-4850
  • Matthieu Guémann — 1-École Universitaire de Kinésithérapie du Centre Val de Loire (EUK-CVL), Université d'Orléans, France. 2-Sport, Physical Activity, Rehabilitation and Movement for Performance and Health (SAPRéM), Université d'Orléans, Orléans, France. https://orcid.org/0000-0003-1896-2796

DOI:

https://doi.org/10.52057/erj.v4i1.46

Keywords:

Reliability, Internal consistency, Clinimetry, Conference abstracts assessment

Abstract

Background

The "Société Française de Physiothérapie" (SFP) has organized biennial conferences known as the "Journées Francophones de la Kinésithérapie" (JFK) since 2007. The scientific committee of the 2023 JFK invited researchers in rehabilitation and reeducation to submit their work prior to the conference. Each submission was evaluated by two reviewers using a predefined rating grid, which determined its acceptance for presentation. However, the reliability of this process had never been evaluated.

Objective

This study aims to assess the reliability and internal consistency of the JFK submission rating process conducted by the scientific committee.

Method

Each submission was evaluated blindly by two reviewers using a standardized 47-item rating grid. The items were grouped into domains (e.g., methodology, relevance to physiotherapy). Reliability was assessed using the Intraclass Correlation Coefficient (ICC) for final scores, Cohen's kappa for individual grid items, and Cronbach's alpha for domain consistency.
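As an illustration of the agreement and consistency statistics named above, the following sketch computes Cohen's kappa (chance-corrected agreement between two raters on a categorical item) and Cronbach's alpha (internal consistency of a domain's items) from scratch. This is a minimal, hypothetical implementation; the toy data below are invented for demonstration and are not the study's data.

```python
from statistics import pvariance

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters on one categorical item."""
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed proportion of agreement
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each rater's marginal frequencies
    p_exp = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)

def cronbach_alpha(items):
    """Internal consistency of a domain.

    items: list of k item-score lists, each of length n_submissions.
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per submission
    sum_item_var = sum(pvariance(scores) for scores in items)
    return (k / (k - 1)) * (1 - sum_item_var / pvariance(totals))

# Toy example: two raters scoring 6 submissions on one binary grid item
kappa = cohens_kappa([1, 0, 1, 1, 0, 1], [1, 0, 0, 1, 0, 1])
# Toy example: a 3-item domain scored on 4 submissions
alpha = cronbach_alpha([[4, 3, 5, 2], [4, 2, 5, 3], [3, 3, 4, 2]])
```

A kappa above 0.6 is commonly read as substantial agreement (none of the grid items reached that threshold here), while an alpha above 0.9 often signals redundant items rather than better consistency.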

Results

Reliability, measured by the ICC, was poor (0.003 [CI95 = -0.003 ; 0.888]). No individual grid item's Cohen's kappa exceeded 0.6. Of the 10 domains, 9 had a Cronbach's alpha above 0.7, indicating good consistency; however, 5 domains had an alpha above 0.9, suggesting item redundancy.

Conclusion

The JFK 2023 submission rating process displayed poor reliability. These findings can guide the JFK 2025 scientific committee in improving the assessment process.

Published

2024-02-23

Section

Original Research
