[ENH] BEP 020 Eye Tracking #1128

Open · wants to merge 184 commits into master
Conversation

@mszinte mszinte commented Jun 15, 2022

Here are the specifications of BEP 020 on eye tracking.

  • It follows the main discussion initiated in a Google document.
  • It includes the different modifications the group of maintainers suggested to us during our Zoom meeting.
  • It includes the macros as used in other modality-specific extensions.
  • It includes links to dataset examples.

Note

We meet regularly and everyone is welcome:
Next meeting December 19th 2024 4pm UTC (EST 11am, PST 8am, CET 5pm, GMT 4pm) on Zoom.

Note that if you are considering joining but this time or day doesn't suit you, reach out to me (@mszinte) and I will arrange another appointment.

Notes of last meeting

Chat and discussions are also happening on Matrix.

We are currently drafting a companion paper for this BEP, feel free to participate (GoogleDoc).

Issues for:

  • implement macros
    • for filename templates?
    • for examples?
    • for the metadata table
  • add contributors to the wiki (so they can be added to the contributors page)
  • finish documentation
  • update examples
  • update validator
  • update list of contributors via the GitHub wiki

@mszinte mszinte requested a review from tsalo as a code owner June 15, 2022 12:09
@sappelhoff sappelhoff added the BEP label Jul 12, 2022
@sappelhoff
Member

(NOTE: I'll cross-post this message across several BEP threads)

Hi there, just a quick notification that we have just merged #918 and it may be interesting to look at the implications for this BEP.

We are introducing "BIDS URIs", which unify the way we refer to and point to files in BIDS datasets (as opposed to "dataset-relative" or "subject-relative" or "file-relative" links).

If the diff and discussion in the PR are unclear, you can also read the rendered version: https://bids-specification.readthedocs.io/en/latest/02-common-principles.html#bids-uri

Perhaps there are things in the BEP that need adjusting now, but perhaps also not -- in any case it's good to be aware of this new feature!

Let me know if there are any questions, comments, or concerns.

Resolved review threads on:
  • mkdocs.yml
  • src/04-modality-specific-files/10-eye-tracking.md
  • src/schema/metadata/AOIDefinition.yaml
@tsalo tsalo changed the title [ENH] Bep020 [ENH] BEP 020 Eye Tracking Aug 24, 2022
`sub-01_task-visualSearch_recording-eye1_physio.json` sidecar could read:
Collaborator

insert in example here

Collaborator

What example is missing here?

Collaborator

Remi-Gau commented Aug 2, 2024

Discussed previously during some meetings with @oesteban @mszinte

Note that if the source data for a single run was acquired by turning on the eye tracker only during trials, instead of keeping the eye-tracking recording running for the whole duration of the run, this will lead to discontinuous timestamps.

@julia-pfarr and I are encountering the issue in some of the datasets we are converting.

The decision for now is to pad the output files with rows of NaNs for the missing time points.
Note that this technically "creates" timepoints that were not recorded and that it may inflate the size of the output file.

I wonder whether this consequence of using physio data for this kind of eye-tracking acquisition should be mentioned somewhere in the spec, or if this is more a converter implementation detail plus best-practice recommendations for data acquisition...
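The padding approach described above can be sketched as follows (a minimal illustration, not the converter's actual code; the function name, argument layout, and the 10 Hz toy data are all hypothetical):

```python
import numpy as np

def pad_discontinuous(timestamps, values, sfreq):
    """Place samples on a continuous grid at the nominal sampling rate,
    filling time points where the tracker was not recording with NaN."""
    # Map each timestamp to an integer sample index to avoid float drift.
    idx = np.round(np.asarray(timestamps) * sfreq).astype(int)
    start = idx[0]
    idx = idx - start
    out = np.full(idx[-1] + 1, np.nan)
    out[idx] = values  # recorded samples keep their values
    grid = (np.arange(out.size) + start) / sfreq  # continuous timestamps
    return grid, out

# Two recorded trials with an unrecorded gap in between (toy 10 Hz data).
t = [0.0, 0.1, 0.2, 0.8, 0.9]
x = [512, 514, 511, 520, 519]
grid, padded = pad_discontinuous(t, x, sfreq=10)
# padded now has 10 rows; the 5 unrecorded points (0.3 s to 0.7 s) are NaN,
# which is exactly the file-size inflation mentioned above.
```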

@scott-huberty

Hi everyone,

Note that if the source data for a single run was acquired by turning on the eye tracker only during trials, instead of keeping the eye-tracking recording running for the whole duration of the run, this will lead to discontinuous timestamps.

Frustratingly, this is at least somewhat common with EyeLink eye trackers. EyeLink will also stop recording any time you enter a calibration sequence.

The decision for now is to pad the output files with rows of NaNs for the missing time points. Note that this technically "creates" timepoints that were not recorded and that it may inflate the size of the output file.

This is exactly what we did in the EyeLink reader in MNE. I was also unsure whether it was the right thing to do. It makes me feel a little better about that decision, seeing you all arrive at the same conclusion independently 🙂

I wonder whether this consequence of using physio data for this kind of eye-tracking acquisition should be mentioned somewhere in the spec, or if this is more a converter implementation detail plus best-practice recommendations for data acquisition...

Just a heads up that our decision to pad with NaNs in MNE has caused its fair share of headaches, especially with signal-processing routines (for example, if you filter your pupil-size signal, one of those NaNs could obliterate the whole signal!).
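The filter hazard described above is easy to reproduce: a linear filter mixes every NaN into its neighbors, so even a short gap wipes out more samples than were missing (and an IIR filter would propagate the NaN to the end of the trace). A minimal sketch with a 3-point moving average (toy values, not MNE code):

```python
import numpy as np

# Pupil-size trace with one NaN-padded (unrecorded) sample.
pupil = np.array([5.0, 5.1, np.nan, 5.2, 5.3, 5.2, 5.1])

# 3-point moving average: every output window touching the NaN is lost.
kernel = np.ones(3) / 3
smoothed = np.convolve(pupil, kernel, mode="same")
# np.isnan(smoothed) is now True at three positions, not one.

# A common workaround: interpolate across the gap before filtering.
good = ~np.isnan(pupil)
filled = np.interp(np.arange(pupil.size), np.flatnonzero(good), pupil[good])
smoothed_ok = np.convolve(filled, kernel, mode="same")  # no NaNs remain
```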

fields:
  EnvironmentCoordinates: required
  RecordedEye: required
  SampleCoordinateUnits: required
Collaborator

Note that the unit may also be described in the description of each column of the physio file, so this may conflict with also having it here, no?

@qian-chu

Hi everyone, thanks for the hard work pushing this forward! I have a question/minor suggestion: when timestamp is among the columns, shouldn't we also encourage users to provide metadata about it? For example, the unit (ms for EyeLink, ns for Pupil Labs) and the reference frame (UNIX time or time since system startup). The current example doesn't provide such info:

```JSON
{
    "DeviceSerialNumber": "17535483",
    "Columns": ["timestamp", "x_coordinate", "y_coordinate", "pupil_size"],
    "EnvironmentCoordinates": "top-left",
    "Manufacturer": "SR-Research",
    "ManufacturersModelName": "EYELINK II CL v4.56 Aug 18 2010",
    "RecordedEye": "right",
    "SampleCoordinateSystem": "gaze-on-screen",
    "SampleCoordinateUnits": "pixel",
    "SamplingFrequency": 1000,
    "SoftwareVersion": "SREB1.10.1630 WIN32 LID:F2AE011 Mod:2017.04.21 15:19 CEST",
    "ScreenAOIDefinition": [
        "square",
        [100, 150, 300, 350]
    ],
    "pupil_size": {
        "Description": "Pupil area of the recorded eye as calculated by the eye-tracker in arbitrary units (see EyeLink's documentation for conversion).",
        "Units": "a.u."
    }
}
```
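For illustration, the timestamp column could be documented in the sidecar like any other column. This is a sketch only: the "Description" text and the choice of "ms" are hypothetical, and a dedicated key for the clock's reference frame does not yet exist in this BEP:

```JSON
{
    "Columns": ["timestamp", "x_coordinate", "y_coordinate", "pupil_size"],
    "timestamp": {
        "Description": "Sample timestamp issued by the eye-tracker, counted from tracker start-up.",
        "Units": "ms"
    }
}
```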

EncodingTechnique:
  name: EncodingTechnique
  display_name: Encoding Technique
  description: |
    The encoding technique used during readout.
    For example, `"Cartesian"`, `"EPSI"`, `"Spiral"`,
    or `"Density-weighted concentric ring trajectory"`.
  type: string
Collaborator

typo due to merge conflict resolution

@Sourav-Kulkarni Sourav-Kulkarni left a comment


Inputs from the previous meeting about inserting details on the eye-tracking setup 'geometry'.

Review threads on src/schema/objects/metadata.yaml
@oesteban
Collaborator

oesteban commented Oct 3, 2024

The decision for now is to pad the output files with rows of NaNs for the missing time points.
Note that this technically "creates" timepoints that were not recorded and that it may inflate the size of the output file.

Author

@mszinte mszinte left a comment


Looks good to me, thanks.

@arnodelorme

We are eager to release our HBN dataset (2000+ subjects with eye tracking). I have provided several candidates on another thread matching the proposed format. When can we expect the format to be finalized? Thanks!

@mszinte
Author

mszinte commented Oct 12, 2024

We are eager to release our HBN dataset (2000+ subjects with eye tracking). I have provided several candidates on another thread matching the proposed format. When can we expect the format to be finalized? Thanks!

In my opinion the BEP is ready; we are finishing the examples and testing our converter in depth.
We hope to ask very soon for BIDS community review and to go through the final steps before finalization.

I would therefore suggest you take it as it is and keep an eye on this thread in case a reviewer requests important changes once we open it for review.

Best,

name: pupil_size
display_name: Pupil size
description: |
  Pupil size or area of the recorded eye, in the


Are we recording which particular type of pupil size is being recorded here ('diameter' vs. 'area')?

The data that comes out of EyeLink is always in 'arbitrary units' (EyeLink documentation: page 106, 4.4.3), and the actual data type is stored in the settings (EyeLink documentation: page 25, 'Pupil Size Data').

@scott-huberty

Hi all - I lost track of the meeting schedule. Is there one scheduled in the next month?

@julia-pfarr
Member

Hi all - I lost track of the meeting schedule. Is there one scheduled in the next month?

Not yet! Please fill out this poll to find a time for the next meeting: https://doodle.com/meeting/participate/id/b2OzP3jd. Times should be displayed in your time zone. I tried to choose times that are manageable across all time zones.

display_name: Coordinate System of the Gaze Position
description: |
  Coordinate system of the gaze position recordings.
  Generally eye-tracker are set to use `"gaze-on-screen"` coordinate system but you may use
Collaborator

Suggested change
- Generally eye-tracker are set to use `"gaze-on-screen"` coordinate system but you may use
+ Generally eye-trackers are set to use `"gaze-on-screen"` coordinate system, but you may use

Coordinate system of the gaze position recordings.
Generally eye-tracker are set to use `"gaze-on-screen"` coordinate system but you may use
`"eye-in-head"` or `"gaze-in-world"` or other alternatives of your choice.
If you use the standard `"gaze-on-screen"`, it is RECOMMENDED to use this
Collaborator

I would make this an enum and allow "custom" so the user can specify a system that is not captured by the enum.
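Such an enum could be sketched in the schema as follows (illustrative only; the exact value list, and whether a "custom" escape hatch is spelled this way, are open questions rather than part of the current proposal):

```yaml
SampleCoordinateSystem:
  name: SampleCoordinateSystem
  display_name: Coordinate System of the Gaze Position
  description: |
    Coordinate system of the gaze position recordings.
  type: string
  enum:
    - gaze-on-screen
    - eye-in-head
    - gaze-in-world
    - custom
```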

selectors:
  - suffix == "physio"
fields:
  EnvironmentCoordinates: required
Collaborator

This one is very problematic if REQUIRED, what happens when "SampleCoordinateSystem" is not "gaze-on-screen"?

Does this make sense for "eye-in-head" or "gaze-in-world"?

"EnvironmentCoordinates": "top-left",
"Manufacturer": "SR-Research",
"ManufacturersModelName": "EYELINK II CL v4.56 Aug 18 2010",
"RecordedEye": "right",
Collaborator

Did we forget this?? 👀

Suggested change
- "RecordedEye": "right",
+ "PhysioType": "eyetrack",
+ "RecordedEye": "right",
