Working to improve the lives of sleep apnea patients and researchers one commit at a time.
Boston, MA
0000-0002-7368-8374
https://github.com/remomueller
Glad to hear it! We're also working on rechecking dates throughout our other datasets and reposting EDFs that had the issue, and I'll be putting together a general post soon with details on which datasets were updated. Thanks for letting us know, and don't hesitate to reach out about anything else you come across!
Edit: Blog Post now Live
Hello fasann,
I've put together a simple script that will update the `start_date_of_recording` for EDFs.
The script is located here: https://gist.github.com/remomueller/9c0f5da531fde34da91a9c013f895fe2
You will need to have edfize v0.4.0 or higher installed as well: `gem install edfize --no-document`
To run it, navigate in your terminal or command prompt to the folder of EDFs, then run `ruby rewrite_signal_date.rb`
It will rewrite the EDFs in place, so make sure you run this on a copy of your data.
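For reference, here is a minimal sketch of the same idea without the edfize dependency (this is not the gist itself, and the helper name is illustrative). Per the EDF specification, the start date is an 8-character ASCII field ("dd.mm.yy") at byte offset 168 of the header (8 bytes version + 80 bytes patient ID + 80 bytes recording ID), so it can be patched in place:

```ruby
# Sketch: patch the 8-character startdate field ("dd.mm.yy") of an EDF
# header in place. Per the EDF specification the field starts at byte
# offset 168: 8 (version) + 80 (local patient id) + 80 (local recording id).
STARTDATE_OFFSET = 168

def rewrite_start_date(path, date = "01.01.85")
  raise ArgumentError, "expected dd.mm.yy" unless date =~ /\A\d{2}\.\d{2}\.\d{2}\z/
  File.open(path, "r+b") do |f|
    f.seek(STARTDATE_OFFSET)
    f.write(date)  # overwrites exactly 8 bytes; rest of the file untouched
  end
end
```

As with the gist, this rewrites the file in place, so run it on a copy of your data.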
We'll also be updating the EDFs in the SHHS dataset; however, this may take some time, as we will run other tests against the EDFs before reposting as well.
Hope this helps!
Thanks for reaching out on the forum! We're aware of the issue with the invalid dates, and are going to work our way back through some of the originally posted datasets to set the date to 01.01.85. We have already started using this new approach in the MESA EDFs posted here. This date is chosen specifically to adhere to the two-digit year clipping present in regular EDFs (not EDF+): 1985 is the "clipping year", and as such the earliest date we can represent in a regular EDF.
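As a quick illustration of that clipping: the two-digit year in a regular EDF header is conventionally interpreted (following the EDF+ recommendation) so that 85–99 mean 1985–1999 and 00–84 mean 2000–2084. A hypothetical helper sketching that rule:

```ruby
# Sketch of the two-digit year convention for EDF start dates:
# years 85..99 are read as 1985..1999 and 00..84 as 2000..2084,
# which makes 1985 the earliest year a regular EDF can represent.
def edf_full_year(two_digit_year)
  two_digit_year >= 85 ? 1900 + two_digit_year : 2000 + two_digit_year
end
```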
Tomorrow we're also releasing an update to edfize, the Ruby tool we use internally to check the validity of EDFs, that will include checking for invalid dates. I'll put together an example Ruby script demonstrating how to rewrite the date in the header as well, and I'll post it in this thread in the next couple of days. Example gists using edfize:
- Check EDFs for unusual Physical Dimensions
- Check EDFs for signal labels that include a dash
I'll pass the issue with the switched physical min and max along to the team; however, I'm not sure there are any current plans to address it, as I believe it does not go against the EDF specification and simply represents the leads being reversed when the signal was originally collected. I think most EDF viewers provide the ability to flip a signal for cases like these.
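To see why swapped physical min/max amounts to a flipped signal, recall the linear digital-to-physical mapping from the EDF specification. A small sketch (the function name is illustrative, not part of any NSRR tool):

```ruby
# EDF maps a stored digital sample to a physical value linearly:
#   physical = p_min + (digital - d_min) * (p_max - p_min) / (d_max - d_min)
# Swapping p_min and p_max negates the slope, mirroring the trace about
# the midpoint of the physical range (a plain sign flip when the range
# is symmetric, e.g. -250..250 uV).
def to_physical(digital, p_min, p_max, d_min, d_max)
  p_min + (digital - d_min) * (p_max - p_min).to_f / (d_max - d_min)
end
```

This is why a viewer-side "invert signal" option is usually enough to correct for reversed leads.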
Hope that helps, and let us know if you have any other questions! Thanks!
Thanks for posting Sara! We've also created a blog post for this to increase visibility to members of the site.
Hi Onur, and welcome to the site! Our team has reached out to the developers to see if there are any potential solutions for this. Also, are you using an EDF from the NSRR that is causing the viewer to fail? Thanks!
Thanks for all the help you've given over the years Susan! You've joined the ranks of our Past Contributors!
Dear Jing,
Many of the datasets hosted on the NSRR come with a Dataset Introduction. For example, you can find the one for the Sleep Heart Health Study here: https://sleepdata.org/datasets/shhs/pages/3-dataset-introduction.md
Each of the datasets has public documentation you can read through and you can use the following links as a starting place:
Dataset Introductions
Additionally, you can browse through the variables for each dataset and take a look at the files available for download; for example, the SHHS files can be found here.
Hope that helps!
Hi Matthew,
These look like missing codes to me, which are typically removed during the SAS to CSV export for datasets. I'll follow up with @mrueschman to find more documentation on sof:v8age values.
I've also created GitHub Issue #35 to keep track of this.
Thanks for reporting!
Great to hear! Thanks for following up!
Hi Stephany and Mathias, thanks for reporting the issue! We updated the configuration for the servers to better handle files over 1GB, and have hopefully resolved the issues. If you run into the same problem going forward, please send us a message and we'll investigate further. Thanks again! Remo