
Sleep Heart Health Study

6.4 Quality Assurance and Quality Control

6.4.1 Weekly QA Exercises at the SRC

Weekly scorers’ staff meetings will be conducted at the SRC, with 1.5 to 3 hours per week dedicated to meeting and discussing scoring issues. The weekly exercises will be organized by the Chief Polysomnologist, with topics reflecting ongoing needs identified by the staff or by the CC/PSG Committee. These exercises will include:

  • scoring of randomly chosen epochs
  • scoring of problem records/epochs
  • discussions of any problematic rules/examples identified during scoring
  • rotating “paired” scoring exercises

Exercises will include individual scoring with follow-up discussion of any discrepancies in assigned scores. A discussion, with participation of SRC investigators, will establish a “consensus” score for discrepant records. Results of any discrepancies between scorers and the Chief Polysomnologist or between any scorers will be reviewed at weekly meetings with investigators. The results of the deliberations will be kept on file. This will include copies of ambiguous records and a summary of any arbitration.

Examples of organized meetings include:

  • QA Exercises: At least monthly, individual scorers will score the same 50 epochs from a selected study. Most studies for QA scoring will be chosen randomly; however, scorers will also identify problematic studies that may offer useful training/teaching points during QA exercises. Scoring of such designated studies will be recorded on an epoch-by-epoch basis. Differences in any epoch or event assignment between scorers will be discussed during the weekly QA meeting. Results of this discussion (the group consensus) will be noted on a separate form. When a group consensus cannot be reached, the epoch or event will be designated “indeterminate”. Data will be entered into a database and summarized quarterly for internal QA tracking.

  • Scoring Exercises: In weeks when no QA exercise is performed, 30-60 minutes of paired scoring (one scorer scoring, one watching) or team scoring (with group members acting as a single unit) will be done. Any differences noted will be discussed after the scoring exercise. Scorers will rotate roles, partners, and teams so that every pairing occurs over a given time period.

  • Studies which pose difficulties in scoring or present interesting problems will be reviewed by the entire SRC staff during weekly meetings. Minutes from these meetings and printed copies of problem epochs will be maintained.
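The epoch-by-epoch comparison and group-consensus step in the QA exercises above can be sketched as follows. This is a minimal illustration, not the study's actual software; the stage labels, scorer names, and the majority rule used to stand in for group consensus are all assumptions.

```python
from collections import Counter

def consensus_by_epoch(scores_by_scorer):
    """scores_by_scorer: {scorer: [stage label per epoch]}.
    Returns one consensus label per epoch; epochs with no strict
    majority are marked 'indeterminate' pending group discussion."""
    scorers = list(scores_by_scorer)
    n_epochs = len(scores_by_scorer[scorers[0]])
    consensus = []
    for i in range(n_epochs):
        counts = Counter(scores_by_scorer[s][i] for s in scorers)
        label, freq = counts.most_common(1)[0]
        # require a strict majority before calling it a consensus
        consensus.append(label if freq > len(scorers) / 2 else "indeterminate")
    return consensus

def percent_agreement(a, b):
    """Simple pairwise epoch agreement between two scorers."""
    return 100.0 * sum(x == y for x, y in zip(a, b)) / len(a)

# Illustrative data: three scorers, five epochs (a real exercise uses 50)
scores = {
    "scorer_A": ["W", "N1", "N2", "N2", "R"],
    "scorer_B": ["W", "N1", "N2", "N3", "R"],
    "scorer_C": ["W", "N2", "N2", "N3", "R"],
}
print(consensus_by_epoch(scores))
print(percent_agreement(scores["scorer_A"], scores["scorer_B"]))
```

Pairwise agreement figures like these are what a quarterly summary database would accumulate for internal QA tracking.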

6.4.2 Tracking of QA/QC Data

The Chief Polysomnologist will track two types of data: data from the actual scored SHHS records and data generated during QA exercises. Using actual scored SHHS data, the overall mean RDI, sleep stage values, and arousal indexes will be tracked for each scorer. If average values differ by >15% for any given scorer, those records will be reviewed by the Chief Polysomnologist. The SRC Director will determine whether re-training and re-standardization are required. Using data from the QA exercises, levels of agreement among scorers will be determined and trends tracked over time. Any scorer found to deviate excessively (>10% from the consensus score) on 3 consecutive exercises will be “re-trained.” “Re-training” will be considered successful if review of at least 5 additional studies demonstrates no deviation from the scoring protocol and subsequent QA exercises show no deviations in performance compared to scoring assignments made by the other scorers.
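The per-scorer tracking described above amounts to comparing each scorer's mean summary statistic against the group mean and flagging large deviations. A minimal sketch, assuming hypothetical scorer names and values; only the 15% threshold comes from the protocol.

```python
def flag_deviating_scorers(means_by_scorer, threshold_pct=15.0):
    """means_by_scorer: {scorer: mean of a tracked statistic, e.g. RDI}.
    Returns scorers whose mean deviates from the overall mean by more
    than threshold_pct percent, for review by the Chief Polysomnologist."""
    overall = sum(means_by_scorer.values()) / len(means_by_scorer)
    flagged = []
    for scorer, mean in means_by_scorer.items():
        deviation_pct = abs(mean - overall) / overall * 100.0
        if deviation_pct > threshold_pct:
            flagged.append(scorer)
    return flagged

# Illustrative mean RDI values for three scorers over one period
mean_rdi = {"scorer_A": 10.0, "scorer_B": 10.4, "scorer_C": 14.0}
print(flag_deviating_scorers(mean_rdi))
```

The same function would apply unchanged to sleep stage percentages or arousal indexes, since only the tracked statistic differs.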

At monthly QA meetings, statistics summarizing inter- and intra-scorer variability will be reviewed by the SRC staff, who will identify possible explanations for differences. If differences between scorers cannot be explained by real differences in the studies assigned to a given scorer in a given time period, scorers noted to score differently will score together, concentrating on the areas where differences were noted. The Chief Polysomnologist will review consecutive records and reinstruct the scorer. Subsequent scoring will be monitored until conformity is demonstrated.

6.4.3 Outlier Checks

Study-by-study outlier review

After each study is scored and an initial report is generated, the scorer will use a computer program to identify extreme outliers. These entries will be reviewed, and the results of this review will be noted on a PSG scoring form. New reports will be generated if any editing of the record is required.

Batch outlier review

On a weekly basis, prior to sending reports to sites, the reports will be subjected to a secondary review for outliers. Any records so identified will be checked against the QS form/log book to ascertain that the record was previously flagged as containing an outlier and that the problem was adequately documented.

On a monthly basis, scored PSG data will be imported into a permanent SAS file. Prior to importing, these data will be subjected to a third check for outliers.
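The outlier screen applied at each of the three stages above can be sketched as a simple range check on scored summary variables. The variable names and plausible ranges below are illustrative assumptions, not the study's actual limits.

```python
# Hypothetical plausible ranges for scored PSG summary variables
PLAUSIBLE_RANGES = {
    "total_sleep_time_min": (0, 720),
    "rdi": (0, 150),
    "arousal_index": (0, 100),
    "pct_rem": (0, 60),
}

def find_outliers(record):
    """record: {variable: value} for one scored study.
    Returns [(variable, value), ...] for values outside their plausible
    range, to be reviewed and noted on the PSG scoring form."""
    outliers = []
    for var, value in record.items():
        lo, hi = PLAUSIBLE_RANGES.get(var, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            outliers.append((var, value))
    return outliers

# Illustrative record: REM percentage is outside its plausible range
record = {"total_sleep_time_min": 380, "rdi": 12.5, "pct_rem": 95}
print(find_outliers(record))
```

The same check could run per study after scoring, per batch before reports go to sites, and once more before the monthly SAS import.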

6.4.4 External Review of Scoring Reliability

The Coordinating Center and Polysomnography (PSG) Committee will establish methods for tracking scoring reliability and drift. This includes re-processing of records for reliability exercises and generation of summary data for each scorer. The overall means for RDI, sleep stage percentage, and arousal indexes will be calculated for each scorer over discrete time periods (monthly to quarterly) and reported to the PSG Subcommittee/Coordinating Center (CC). Intra- and inter-reader differences in the summary data for RDI, sleep stages, and arousal indices will be determined. The same mean values will be calculated for each site over the same time periods. The PSG Subcommittee/CC will monitor intra- and inter-reader reliability to determine the threshold for requiring remediation, including retraining or removing a reader.

The PSG Committee, in conjunction with the Coordinating Center (CC), may design a formal reliability study of scoring. For example, a sample of previously read PSG studies will be assigned a new “dummy” identification number and assigned to the scorers for repeat scoring. Scoring of these studies will be integrated into the normal workflow to minimize the likelihood of their being identified as special studies. Studies will be assigned to a different reader to assess inter-reader reliability. Studies will also be assigned to the same reader at defined time intervals to assess intra-reader reliability.
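The repeat-scoring comparison described above could be summarized by pairing each original score with its dummy-ID repeat and averaging the differences. A minimal sketch under stated assumptions: the RDI values are illustrative, and mean absolute difference stands in for whatever reliability statistic the PSG Subcommittee/CC would actually choose.

```python
def mean_abs_difference(pairs):
    """pairs: [(original_value, repeat_value), ...] for one summary
    statistic such as RDI. Smaller values indicate better reliability."""
    return sum(abs(a - b) for a, b in pairs) / len(pairs)

# Illustrative (original RDI, repeat RDI) pairs:
# repeats scored by the SAME reader estimate intra-reader reliability
intra_pairs = [(10.0, 10.5), (22.0, 21.0)]
# repeats scored by a DIFFERENT reader estimate inter-reader reliability
inter_pairs = [(10.0, 12.0), (22.0, 19.5)]

print(mean_abs_difference(intra_pairs))
print(mean_abs_difference(inter_pairs))
```

Tracked over the defined time intervals, these summaries would show whether either form of reliability is drifting.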

Site visits, coordinated by the CC, will occur as directed by the Steering Committee.
