Thanks for checking out the site. The records were scored by trained sleep clinic staff members. Someone emailed with a similar question recently. Here's my response:
Thanks for using the site. Yes, the STAGES PSGs were manually scored. The CSV annotation files include sleep staging.
The MOP (https://sleepdata.org/datasets/stages/files/documentation/STAGES%20MOP%202018-08-09.pdf) contains this bit:
The PSGs were conducted using each site's clinic protocols, and I assume the same is true for the subsequent scoring. We didn't receive detailed information about the sleep centers or scoring staff, but I presume all of the sleep centers were AASM-accredited.
I don't recall any of our datasets (e.g., CFS, MESA) having ECG sampling frequencies above 256 Hz. Thanks for checking out the site!
Thanks for checking out the site. It doesn't look like the staging for the IS-RC cohort (70 PSGs scored by 6 scorers) was shared with us originally. I see that the staging files were made available in the public Stanford Box share here: https://stanfordmedicine.app.box.com/s/r9e92ygq0erf7hn5re6j51aaggf50jly/folder/53209541138
Hey JaHyungKoo - thanks for checking out the site and asking all these questions.
We have many different AHI-like variables. These are mostly continuous metrics (number of events per hour), and it is up to the user to convert them into a categorical breakdown like 0-5, 5-15, 15-30, and 30+. One of the most "general" AHI metrics we have is this harmonized term - https://sleepdata.org/datasets/shhs/variables/nsrr_ahi_hp3r_aasm15
Yes, I suggest you create your own categorical indicator from one of the continuous AHI variables.
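As a quick sketch of what that could look like, here is one way to bin a continuous AHI variable into the conventional severity categories using pandas. The values below are made-up illustration data, and the bin edges follow the 0-5/5-15/15-30/30+ breakdown mentioned above; adjust edge handling to match your own severity definitions.

```python
import pandas as pd

# Illustration only: a few hypothetical AHI values for the
# harmonized variable named above.
df = pd.DataFrame({"nsrr_ahi_hp3r_aasm15": [2.1, 7.4, 18.0, 44.3]})

# Bin into the conventional categories; right=False makes the
# bins left-inclusive: [0, 5), [5, 15), [15, 30), [30, inf).
df["ahi_category"] = pd.cut(
    df["nsrr_ahi_hp3r_aasm15"],
    bins=[0, 5, 15, 30, float("inf")],
    labels=["0-5", "5-15", "15-30", "30+"],
    right=False,
)
print(df["ahi_category"].tolist())  # ['0-5', '5-15', '15-30', '30+']
```

Whether a boundary value like exactly 5 events/hour falls into the lower or upper category depends on the `right` argument, so pick whichever convention your analysis uses.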
Yes, in this instance, "Not applicable" will also mean "No" for ahiov50.
Thanks for using the site and for your willingness to share your tool. Maybe you could host the code on GitHub/GitLab or a similar service? Feel free to post the link here when that's complete!
Some of the NSRR tools in ongoing development have public repositories here: https://gitlab-scm.partners.org/zzz-public
Yes, I think running with the "--fast" flag should work. You can't tell the gem to start at a specific file; it still needs to run through all the prior files and compare file sizes.
Hi - I'll ask another member of the team to comment on your first two questions.
Regarding #3 - SHHS had a 12-lead ECG reading done in clinic (i.e. not part of the overnight PSG). Some result variables from the 12-lead ECG are here: https://sleepdata.org/datasets/shhs/variables?folder=Clinical+Data%2FDiagnostic+Studies%2FElectrocardiogram
It sounds like for your research you will focus on the ECG signals in the overnight PSG recordings.
Good questions - this sort of "reliability" was assessed for the beginning and end of every REST interval in the actigraphy scoring. The actigraphy scorer made a quick judgment about whether the indicators (e.g., event marker from a device button press, diary, light levels, activity levels) were consistent with one another, and recorded these judgments on the QS form.
whether the "interval" info reported in the actigraphy file is considered reliable or not
All of the REST intervals in the actigraphy files were deemed to be valid and usable by the scorer. If, for instance, a day of actigraphy had lots of non-wear time (particularly during a nighttime sleep window) then the day would be entirely excluded and you would not find an interval in the actigraphy file.
The "reliability" here is an assessment of how well the different indicators align (or don't). It is not a judgment on whether or not the interval itself is usable (or not).
if I can access the reported reliability on a detected rest episode basis
No, these reliability indicators (from QS form) were not included in the shared data package.
I ran the command "nsrr download shhs/polysomnography/edfs --fresh" and it worked.
Please post a screenshot if you don't get the same result.
Thanks for using the site. What happens if you run the "nsrr download shhs" command? Try it directly from your terminal, without entering the gem console.