[MRG] Added epilepsy example dataset #133
Conversation
Thanks!
There are still some errors to be corrected though (see comments).
Regarding the Travis error:
3: [ERR] The column names of the channels file must begin with ['name', 'type', 'units', 'sampling_frequency', 'low_cutoff', 'high_cutoff', 'notch', 'reference'] (code: 72 - MISSING_TSV_COLUMN_IEEG_CHANNELS)
you can coordinate with Chris H.; see: https://github.com/bids-standard/bids-validator/pull/652
Regarding the error
4: [ERR] Empty files not allowed. (code: 99 - EMPTY_FILE)
I have raised this issue here: https://github.com/bids-standard/bids-validator/issues/651#issuecomment-442821513
and it needs to be discussed, cc @chrisfilo
fz EEG uV n/a n/a 512 unknown good
cz EEG uV n/a n/a 512 unknown good
ecg1 ECG uV n/a n/a 512 unknown good
ecg2 ECG uV n/a n/a 512 unknown good
You are missing a column for the last 5 rows; if the value is not known, fill it with n/a.
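As an illustration only (this assumes the missing value belongs to one extra column after the one holding "good", e.g. a status_description-style column, and that the fields are tab-separated in the actual file), the padded rows would simply gain a trailing n/a:
fz EEG uV n/a n/a 512 unknown good n/a
cz EEG uV n/a n/a 512 unknown good n/a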
(see also below for similar)
"TaskName": "seizure", | ||
"SamplingFrequency": 512, | ||
"PowerLineFrequency": 50, | ||
"SoftwareFilters", "n/a", |
typo: comma should be a colon
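For reference, with the colon in place this fragment of the sidecar would read (showing only the four fields quoted above):
{
  "TaskName": "seizure",
  "SamplingFrequency": 512,
  "PowerLineFrequency": 50,
  "SoftwareFilters": "n/a"
}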
(see also below for similar)
Regarding the remaining warnings: should I add participants.json and _channels.json files? Regarding the empty-files error: weren't we supposed to post only empty data files in this bids-examples repo?
Yes, please specify a participants.json. The "custom keys" are usually explicitly referred to (and described) in the specification, such that additional information is not necessary. Regarding a template, please see: https://github.com/bids-standard/bids-examples/blob/bep006_eeg/eeg_matchingpennies/participants.json
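A minimal sketch of what such a participants.json could look like (assuming the usual participant_id, age, and sex columns; the actual keys should mirror the columns of this dataset's participants.tsv, as in the linked template):
{
  "participant_id": {
    "Description": "Unique participant identifier"
  },
  "age": {
    "Description": "Age of the participant at the time of recording",
    "Units": "years"
  },
  "sex": {
    "Description": "Sex of the participant",
    "Levels": {
      "M": "male",
      "F": "female"
    }
  }
}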
Thanks - fixed. So if I understand well, there is no way I can post this iEEG dataset on openneuro yet?
great, thanks! It's looking good now.
RE: openneuro:
- you would upload the full data there, so the "Empty File Error" wouldn't be applicable, would it?
- I think openneuro accepts BIDS data as long as it does not throw errors ... the complaint about non-described columns in channels.tsv and electrodes.tsv is just a warning (soon to disappear, once the validator catches up) ... so you should be fine uploading to openneuro.
If @choldgraf or @dorahermes agree, I can merge this.
No errors because of the empty files or the columns in the .tsv files, but none of the .json iEEG files are accepted by the openneuro validator...
ahhhh, I completely forgot ... openneuro is running bids-validator ... but for now, to validate iEEG data, we need to run bids-validator --bep010. This will only change once iEEG is merged into the main spec. Actually, we could already make that step with the validator IMHO; what's your take on this, @chrisfilo? Would it be okay to remove the optional EEG and iEEG flags of the validator now? Or is that something we should do as the "last step of the merging process"?
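For local checks in the meantime, the call would look something like this (the dataset path below is a placeholder):
bids-validator --bep010 /path/to/ieeg_epilepsy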
I would prefer to wait until it's merged into the spec. Hopefully won't be long now.
Mmh okay, fair enough, but that puts us into a dilemma: @ftadel's data is part of iEEG-BIDS, which we want to publish before merging ... but in order to publish, it needs to be uploaded (his choice: openneuro), which is only possible once the merge has happened ;-) Solutions I see:
When will this be merged?
We are waiting for an approving review from @choldgraf or @dorahermes. It seems that in the meantime, adjustments to the validator also made this PR not pass:
Sorry, all of my cycles had gone towards getting the BIDS-iEEG paper submitted. I can try to take a stab at finishing up the validator code. Then we should re-run on this PR and see what modifications need to be made to ensure this dataset passes the validator.
Just a note that updates are now merged into the validator, so we should see how it works on this dataset once more. Looking at those errors:
Passed!
Thanks @ftadel! :-)