
basic eyetracker functionality #10855

Closed
wants to merge 31 commits

Conversation

dominikwelke
Contributor

New raw class for eyetracking data, with reader functions for various data formats. Alignment and merge functionality with other MNE objects is planned.

Development and testing used data recorded with an SR Research EyeLink 1000+.

closes #10751

see also mne-tools/fiff-constants#39

- create new raw class for eyetracking data recorded with SR Research's EyeLink system
- can read EyeLink .asc files (using djangraw's ParseEyeLinkAsc function - https://github.com/djangraw/ParseEyeLinkAscFiles/blob/master/ParseEyeLinkAsc.py)
- TBD: adding annotations, creating info from the file header
- fill missing data in epoched recordings with NaN
- work on parsing errors
- exclude rows that have negative time values
@dominikwelke
Contributor Author

hi all (@larsoner, @drammock)

here are some initial commits. I will contact you on Discord.

@drammock
Member

@dominikwelke FYI I've just tested the current state of the PR with an environment that uses scipy==1.8.0 and it worked fine. So I think we can chalk up the previous failure to "something went wrong with your environment" and not worry about what exactly it was.

Also here is what I see when plotting:
[screenshot: Screenshot_2022-06-30_09-26-16]

I think we need to work on scalings next, so that the traces are a bit more intelligible. See here:

# scalings for the units
scalings=dict(mag=1e15, grad=1e13, eeg=1e6, eog=1e6, emg=1e6, ecg=1e6,
              misc=1.0, seeg=1e3, dbs=1e6, ecog=1e6, dipole=1e9, gof=1.0,
              bio=1e6, hbo=1e6, hbr=1e6, ref_meg=1e15,
              fnirs_cw_amplitude=1.0, fnirs_fd_ac_amplitude=1.0,
              fnirs_fd_phase=1., fnirs_od=1.0, csd=1e3, whitened=1.),
# rough guess for a good plot
scalings_plot_raw=dict(mag=1e-12, grad=4e-11, eeg=20e-6, eog=150e-6,
                       ecg=5e-4, emg=1e-3, ref_meg=1e-12, misc='auto',
                       stim=1, resp=1, chpi=1e-4, exci=1, ias=1, syst=1,
                       seeg=1e-4, dbs=1e-4, bio=1e-6, ecog=1e-4, hbo=10e-6,
                       hbr=10e-6, whitened=10., fnirs_cw_amplitude=2e-2,
                       fnirs_fd_ac_amplitude=2e-2, fnirs_fd_phase=2e-1,
                       fnirs_od=2e-2, csd=200e-4,
                       dipole=1e-7, gof=1e2),
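
(Until a dedicated eyetrack channel type with proper defaults exists, a manual scaling can make the misc-typed traces legible when plotting. A minimal sketch - the value is an illustrative guess for screen-pixel gaze coordinates, not a recommended default:)

# the eyetracking channels are still typed 'misc', so override the plot
# scaling manually; 500 is an illustrative guess for pixel coordinates
raw.plot(scalings=dict(misc=500))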

@dominikwelke
Contributor Author

dominikwelke commented Jun 30, 2022

I think we need to work on scalings next, so that the traces are a bit more intelligible. See here:

sounds good!
note that I still use the misc channel type so far, as the fiff PR mne-tools/fiff-constants#39 is not merged yet.
plus, I don't know what to change in the MNE codebase to add the new channel type.
any pointers, @drammock @larsoner?

x/y coordinates are in screen pixels (in my case), but they might also come as cm or degrees of visual angle with other systems.. pupil area is in arbitrary units. not sure what other manufacturers use.

@drammock
Member

I don't know what to change in the MNE codebase to add the new channel type.

Once mne-tools/fiff-constants#39 is merged, then here:

#
# Channel types
#
FIFF.FIFFV_BIO_CH = 102
FIFF.FIFFV_MEG_CH = 1
FIFF.FIFFV_REF_MEG_CH = 301
FIFF.FIFFV_EEG_CH = 2
FIFF.FIFFV_MCG_CH = 201
FIFF.FIFFV_STIM_CH = 3
FIFF.FIFFV_EOG_CH = 202
FIFF.FIFFV_EMG_CH = 302
FIFF.FIFFV_ECG_CH = 402
FIFF.FIFFV_MISC_CH = 502
FIFF.FIFFV_RESP_CH = 602 # Respiration monitoring
FIFF.FIFFV_SEEG_CH = 802 # stereotactic EEG
FIFF.FIFFV_DBS_CH = 803 # deep brain stimulation
FIFF.FIFFV_SYST_CH = 900 # some system status information (on Triux systems only)
FIFF.FIFFV_ECOG_CH = 902
FIFF.FIFFV_IAS_CH = 910 # Internal Active Shielding data (maybe on Triux only)
FIFF.FIFFV_EXCI_CH = 920 # flux excitation channel used to be a stimulus channel
FIFF.FIFFV_DIPOLE_WAVE = 1000 # Dipole time curve (xplotter/xfit)
FIFF.FIFFV_GOODNESS_FIT = 1001 # Goodness of fit (xplotter/xfit)
FIFF.FIFFV_FNIRS_CH = 1100 # Functional near-infrared spectroscopy
_ch_kind_named = {key: key for key in (
    FIFF.FIFFV_BIO_CH,
    FIFF.FIFFV_MEG_CH,
    FIFF.FIFFV_REF_MEG_CH,
    FIFF.FIFFV_EEG_CH,
    FIFF.FIFFV_MCG_CH,
    FIFF.FIFFV_STIM_CH,
    FIFF.FIFFV_EOG_CH,
    FIFF.FIFFV_EMG_CH,
    FIFF.FIFFV_ECG_CH,
    FIFF.FIFFV_MISC_CH,
    FIFF.FIFFV_RESP_CH,
    FIFF.FIFFV_SEEG_CH,
    FIFF.FIFFV_DBS_CH,
    FIFF.FIFFV_SYST_CH,
    FIFF.FIFFV_ECOG_CH,
    FIFF.FIFFV_IAS_CH,
    FIFF.FIFFV_EXCI_CH,
    FIFF.FIFFV_DIPOLE_WAVE,
    FIFF.FIFFV_GOODNESS_FIT,
    FIFF.FIFFV_FNIRS_CH,
)}
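
Once the new constant exists, it would be appended in both places. A rough sketch - the name and the numeric value are assumptions here, pending mne-tools/fiff-constants#39:

# sketch only: name and value are assumed, not yet confirmed constants
FIFF.FIFFV_EYETRACK_CH = 1200  # eye-tracking channel (assumed value)
_ch_kind_named = {key: key for key in (
    # ... all existing kinds from above, plus:
    FIFF.FIFFV_EYETRACK_CH,
)}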

@larsoner
Member

Don't wait for fiff-constants. Change the lines at the top of test_constants.py to point to your fork + branch that is the same as the branch that you opened in the PR to fiff-constants
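
Judging by the codeload URL in the failure log further down, the change amounts to something like this (the variable names are assumptions; check the actual top of mne/io/tests/test_constants.py):

# point the constants test at a fork + commit of fiff-constants
# (REPO/COMMIT names are assumptions based on the download URL below)
REPO = 'dominikwelke'  # the fork, instead of 'mne-tools'
COMMIT = '3da188c2e0d391bed1e4dd023eb07c909c273218'  # the branch's commit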

@dominikwelke
Contributor Author

Don't wait for fiff-constants. Change the lines at the top of test_constants.py to point to your fork + branch that is the same as the branch that you opened in the PR to fiff-constants

like this @larsoner ?

@larsoner
Member

like this @larsoner ?

Pull the tiny commit I just made then locally do:

pytest mne/io/tests/test_constants.py

and see if it passes -- let pytest tell you if it's correct :)

@dominikwelke
Contributor Author

nope, doesn't pass..
something seems wrong with the coil type, but I don't know where it's set.

here's the error log:

========================================================================================== FAILURES ===========================================================================================
_______________________________________________________________________________________ test_constants ________________________________________________________________________________________
mne/io/tests/test_constants.py:287: in test_constants
    assert val in fif[check], '%s: %s, %s' % (check, val, name)
E   AssertionError: coil: 402 (FIFFV_COIL_EYETRACK_PUPIL), FIFFV_COIL_EYETRACK_PUPIL
E   assert 402 (FIFFV_COIL_EYETRACK_PUPIL) in {0: ['none', 'The location info contains no data'], 1: ['eeg', 'EEG electrode position in r0'], 2: ['nm_122', 'Neuromag 122 coils'], 3: ['nm_24', 'Old 24 channel system in HUT'], ...}
------------------------------------------------------------------------------------ Captured stderr call -------------------------------------------------------------------------------------
Downloading data from 'https://codeload.github.com/dominikwelke/fiff-constants/zip/3da188c2e0d391bed1e4dd023eb07c909c273218' to file '/private/var/folders/fm/wcy0x7fs7f38_95zw5gj0r7073ffwq/T/pytest-of-dominik.welke/pytest-3/test_constants0/fiff.zip'.
SHA256 hash of downloaded file: 0c7f49a4d2194900390fbaa70ad80b0a3003fdc1b633db8593a673b8a9c44ffa
Use this value as the 'known_hash' argument of 'pooch.retrieve' to ensure that the file hasn't changed if it is downloaded again in the future.
------------------------------------------------ generated xml file: /Users/dominik.welke/Work/git_contributions/mne-python/junit-results.xml -------------------------------------------------
==================================================================================== slowest 20 durations =====================================================================================
1.07s setup    mne/io/tests/test_constants.py::test_constants
1.06s call     mne/io/tests/test_constants.py::test_constants

(18 durations < 0.005s hidden.  Use -vv to show these durations.)
=================================================================================== short test summary info ===================================================================================
FAILED mne/io/tests/test_constants.py::test_constants - AssertionError: coil: 402 (FIFFV_COIL_EYETRACK_PUPIL), FIFFV_COIL_EYETRACK_PUPIL
================================================================================= 1 failed, 6 passed in 2.58s =================================================================================

@scott-huberty
Contributor

hi @scott-huberty great news, happy to hear you're interested in helping! I didn't find much time to work on this after the sprint, so any push is welcome.

the code so far is definitely work in progress. documentation is non-existent, as you realised :)

did you run into any problems or errors so far?

No worries @dominikwelke, I knew that this was a feature in development, so I had no unrealistic expectations for it to work for me "out of the box" :)

I hit a couple of small errors that were pretty easy to fix. It looks like the test data you used was recorded in binocular mode. Our test data was collected using monocular mode + remote mode. The .asc files will look slightly different depending on which mode was used, so it will be good to test the code with these different types of data, so we can catch the breaking points and refactor the code where needed!

I'll create my own branch based on this branch and give you more details on what I've changed once it's a bit more organized!

@nmarkowitz
Contributor

Hi, I'm also very interested in this development. I saw that you created a new type of object called RawEyelink. It may make more sense to just have your read_raw_eyelink() function and a general class for eyetracking data (something like RawEyetracking), as there are many different eyetrackers out there and not all of them have a file format per se. For instance, I use a Tobii eyetracker, and that data can be saved within the Matlab or Python session to a .mat file, pickle file, HDF5 file (as in PsychoPy), or any number of other file formats. I can potentially also offer help with this.

@larsoner
Member

and a general class for eyetracking data (something like RawEyetracking), as there are many different eyetrackers out there and not all of them have a file format per se

Well, in theory (based on how BaseRaw works), RawEyelink should only need to override the __init__ and _read_segment_file methods. There shouldn't be anything else in the class, for example anything specific to eye tracking. That stuff should live in new helper functions, for example in mne/preprocessing/eyetracking/*.py. So to me, having RawEyelink for Eyelink data, then another class for some other manufacturer's data, etc., is a reasonable way to go.
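
A minimal sketch of that pattern (the parsing helper _parse_eyelink_asc is hypothetical, and the real reader's internals will differ):

from mne.io import BaseRaw


class RawEyelink(BaseRaw):
    """Raw object for SR Research Eyelink .asc files (sketch)."""

    def __init__(self, fname, verbose=None):
        # hypothetical helper returning an Info object plus an
        # (n_channels, n_samples) array parsed from the .asc file
        info, data = _parse_eyelink_asc(fname)
        # passing the array as ``preload`` hands the data to BaseRaw;
        # nothing eyetracking-specific lives on the class itself
        super().__init__(info, preload=data, filenames=[fname],
                         verbose=verbose)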

@scott-huberty
Contributor

and a general class for eyetracking data (something like RawEyetracking), as there are many different eyetrackers out there and not all of them have a file format per se

Well, in theory (based on how BaseRaw works), RawEyelink should only need to override the __init__ and _read_segment_file methods. There shouldn't be anything else in the class, for example anything specific to eye tracking. That stuff should live in new helper functions, for example in mne/preprocessing/eyetracking/*.py. So to me, having RawEyelink for Eyelink data, then another class for some other manufacturer's data, etc., is a reasonable way to go.

Agreed. We could also differentiate the reader functions by revising the mne/io/eyetrack/* directory to have a folder for each system (Eyelink, Tobii, etc.), where the reader functions for those systems live, just as MNE does for EEG/MEG data.

@sportnoah14, thanks for reaching out! Question: are you collecting Tobii eyetracking data simultaneously with EEG or MEG? IMO, part of the motivation for building this feature for Eyelink systems was being able to co-register the eyetracking x/y gaze traces with EEG/MEG traces (SR Research Eyelink systems can be integrated with EEG or MEG systems, which some of us are using).

Anyway, I'm hoping we can have this branch more developed by the end of September. Maybe it makes more sense to start thinking about adding a reader for the Tobii system at that point, when we'll hopefully have a more solid template for the eyetracking class in MNE?

@dominikwelke
Contributor Author

dominikwelke commented Aug 16, 2022

So to me, having RawEyelink for Eyelink data, then another class for some other manufacturer's data, etc., is a reasonable way to go.

yes @sportnoah14, I simply followed the MNE design for other data types, which has manufacturer-specific Raw classes. as @larsoner already said, these classes aren't much more than differently labelled shells that package array data loaded with helper functions, in this case read_raw_eyelink.

..but actually, I shared your intuition and had even started the codebase using a generic class.. you can still see it in the well-maintained docstring :D

    Returns
    -------
    raw : instance of RawEyetrack
        A Raw object containing eyetracker data.

We could also differentiate the reader functions by revising the mne/io/eyetrack/* directory to have a folder for each system (Eyelink, Tobii, etc.), where the reader functions for those systems live, just as MNE does for EEG/MEG data.

yes, @scott-huberty - shifting code around might make sense as soon as things are a bit more settled and the functionality grows.
for a start, I thought the amount of IO code would be small enough to stay in one file (io/eyetrack/eyetrack.py), even if we add additional manufacturer-specific functions and classes like read_raw_tobii / RawTobii.

for the API it doesn't matter, as users don't need to know where the code sits and already import the reader function as from mne.io import read_raw_eyelink.
eyetracking-specific preprocessing functions sit in mne.preprocessing.eyetracking - these are supposed to work with future eyetracker classes too.

so tl;dr - happy to reorganize the code when it makes sense, but right now I think we can focus on the functionality :)

@dominikwelke
Contributor Author

dominikwelke commented Aug 16, 2022

Anyway, I'm hoping we can have this branch more developed by the end of September. Maybe it makes more sense to start thinking about adding a reader for the Tobii system at that point, when we'll hopefully have a more solid template for the eyetracking class in MNE?

@sportnoah14 - I think if you're motivated there's no need to wait, and you could already start writing a reader function for your files (read_raw_tobii).

the function would need to extract:

  • the sample data (x, y, pupil, ...), as a numpy array
  • events/annotations (I guess there are some?) for synchronization etc.
  • the relevant recording info from the header, such as the sampling frequency

these things are the minimum we need to generate a raw object from the data :)
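
A hedged sketch of turning those three pieces into a Raw object (channel names, types, and values here are illustrative assumptions, not a fixed API):

import numpy as np
import mne

sfreq = 600.0  # sampling frequency from the recording header (assumed value)
data = np.zeros((3, 6000))  # x, y, pupil -- placeholder sample data
info = mne.create_info(['xpos', 'ypos', 'pupil'], sfreq, ch_types='misc')
raw = mne.io.RawArray(data, info)
# events from the recording become Annotations, e.g. for synchronization
raw.set_annotations(mne.Annotations(onset=[1.0], duration=[0.0],
                                    description=['stimulus_onset']))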

@nmarkowitz
Contributor

@dominikwelke I could start writing a basic IO function. Though I don't save Tobii data in its native format (if it has one), but rather save it through the Matlab or Python (PsychoPy) SDK. So I can start by creating a class to handle data from Tobii. Many labs that run psychophysics experiments do this with Tobii. Maybe one of the most useful things to do would be to start creating an IO function for eyetracking data saved via PsychoPy, as it saves the data in a standardized file format and handles many different types of eyetrackers.

I also think having a generic class would be great for eyetracking, and everything could be built from that, as there are many types and companies for eyetracking, and some record additional things besides x/y position on screen and pupil size.

@scott-huberty I'm recording eyetracking simultaneously with EEG data. However, it isn't recorded on the same acquisition system, so they have to be synced by TTL pulses or some other method first.

@larsoner, question for you: eyetrackers sometimes encounter an error and may not record data for a few seconds, leading to unevenly sampled data. What's the best way to handle unevenly sampled data? I.e., data for which it's better to use timestamps rather than a start time and sampling rate (which is assumed to be constant during the whole recording).

I've been looking into eyetracking functions, and this package may be a good reference for the types of functions to incorporate and how to write them: https://github.com/titoghose/PyTrack

@larsoner
Member

Eyetrackers sometimes encounter an error and may not record data for a few seconds, leading to unevenly sampled data. What's the best way to handle unevenly sampled data? I.e., data for which it's better to use timestamps rather than a start time and sampling rate (which is assumed to be constant during the whole recording).

We have no capability to handle unevenly sampled data in our base classes, and adding it would probably be a pain. I would use some simple resampling (e.g., linear interp) to resample to an even sample rate.
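
For example, a minimal sketch of that approach (the function name and signature are my own, not MNE API):

import numpy as np


def resample_to_uniform(timestamps, values, sfreq):
    """Linearly interpolate unevenly sampled data onto an even grid."""
    # timestamps: monotonic sample times in seconds; values: the samples;
    # sfreq: target sampling frequency in Hz
    t_uniform = np.arange(timestamps[0], timestamps[-1], 1.0 / sfreq)
    return t_uniform, np.interp(t_uniform, timestamps, values)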

scott-huberty added a commit to scott-huberty/mne-python that referenced this pull request Sep 9, 2022
scott-huberty added a commit to scott-huberty/mne-python that referenced this pull request Sep 9, 2022
if 'datetime_str' in locals():
    meas_date = dt.datetime.strptime(datetime_str,
                                     '%a %b %d %H:%M:%S %Y')
    meas_date = meas_date.replace(tzinfo=dt.timezone.utc)
Contributor

@dominikwelke I'm not sure that this is correct. The date/time string in the .asc file header appears to be naive (no timezone info). Using replace(tzinfo=...) just forces the timezone to UTC. This would only be correct if that naive date/time string is already expressed in UTC, and not a local timezone. We should check with Eyelink (I have been in touch with them and can send an email).

My other concern is that this date/time string generally won't accurately report the date/time of the session. It is taken from the Host PC (which runs the homebrewed QNX OS) and is never connected to the internet. At least in our lab, the date/time reported in the .asc files is very inaccurate. Is that the case for the .asc files in your lab?

Contributor Author

My other concern is that this date/time string generally won't accurately report the date/time of the session.

yes, timezones are a mess in my setup too.

it is the only thing available, though, isn't it?
I think this software should extract the data that's in the header; necessary changes can then be made manually by the users.
alternatively, we'd have to pass additional information as arguments to the reader function, and I wouldn't want that..

The date/time string in the .asc file header appears to be naive (no timezone info). Using replace(tzinfo=...) just forces the timezone to UTC. This would only be correct if that naive date/time string is already expressed in UTC, and not a local timezone.

you caught me ;)
this was quick and dirty coding to get a timezone-aware stamp from the header.

  1. I wasn't sure we even have the necessary info in the header for setting the accurate timezone ('where' the dataset was recorded) - yes, asking SR what timezone the stamp in the header is stored in makes sense.
  2. given the general level of doubt in the timestamp (see above), I didn't care too much in the first place.
    when aligning the data with other recordings (e.g. EEG) we need better data anyway (e.g. shared events), and then the most trusted timestamp can be used for a merged recording.
    and if the dataset stands on its own, I tend to think super-accurate times are not important - the relative timing of recordings is reliable (session 1 was X hours before session 2), and when you want to publish the dataset, anonymization algorithms (like the one in mne-bids) would change the timestamps anyway.

of course this is only my opinion, happy to make the timestamp extraction better if there is a good way :)

Contributor

I agree that we don't want to ask users to specify a timezone via a function argument - too many opportunities for mistakes. But if we want to set the meas_date, we'll have to specify a time zone, or MNE will raise an error. IMO, not setting a meas_date is preferable to setting an incorrect one. Anyway, I think the meas_date here is primarily metadata; I don't think we will need it for co-registration with EEG/MEG. We will find out when we start working on that!

If you send me your email on Discord, I can CC you in my emails with SR. I'll ask them if there is a good way to discern the timezone in the .asc files.
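
For reference, the constraint mentioned above looks roughly like this (a sketch; the header string is a made-up example, and raw stands for the object being built by the reader):

import datetime as dt

# a naive datetime parsed from a (made-up) .asc header line ...
meas_date = dt.datetime.strptime('Wed Sep 21 09:14:00 2022',
                                 '%a %b %d %H:%M:%S %Y')
# ... must be made timezone-aware (UTC) before MNE accepts it;
# passing a naive datetime to set_meas_date raises an error
raw.set_meas_date(meas_date.replace(tzinfo=dt.timezone.utc))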

                       pupil=pupil,
                       read_eye_events=read_eye_events)
elif ftype == 'edf':
    raise NotImplementedError('Eyelink .edf files not supported, yet')
Contributor

@scott-huberty commented Sep 10, 2022

I removed this line - IMO, support for reading Eyelink data files (.edf) directly isn't feasible. It would require the user to have the Eyelink SDK properly installed on their machine, and the MNE-Python codebase would need wrapper functions to execute the Eyelink SDK binaries (and these functions would need to infer the OS of the user's machine and act accordingly). My understanding is that this can be very difficult to maintain; it is probably simpler for users to convert their files to .asc format with the tools Eyelink provides. But feel free to ask the core devs if you disagree.

Member

At some point pyeparse supported EDF, I think with this code:

https://github.com/pyeparse/pyeparse/tree/master/pyeparse/edf

If it still works we could try using it. But it can be in a follow-up PR, since it might indeed be a pain, and adding ASCII support is already a good first step.

Contributor

@larsoner Oh okay, maybe I was mistaken then - sounds good!

Contributor Author

@dominikwelke commented Sep 20, 2022

@scott-huberty -
I inserted the ftype switch for the future (not this PR).

as @larsoner mentioned, there is the pyeparse codebase, which can in general read Eyelink .edf files (though when I tested it, it seemed - if at all - to only work with monocular recordings).

it requires users to install the Eyelink SDK, but I don't necessarily see this as a problem. converting to .asc also requires downloading extra software from the SR Research forum.

Contributor

@scott-huberty commented Sep 20, 2022

My concern wasn't having the user use other software (like Eyelink's edf2asc converter) so much as having to maintain a wrapper function in MNE that accesses Eyelink's API. But if the pyeparse code is already there, maybe it's not as bad as I thought!

Weird that it only works with monocular recordings. I have some monocular recordings I can test with it.

# transpose to correct sfreq
# samples = df_samples['tSample'].apply(lambda x: x * sfreq / 1000.)
# first_sample = samples.min()
# round a up to the next multiple of b (unchanged if already a multiple)
shiftfun = lambda a, b: a + b - (a % b) if (a % b != 0) else a
Contributor

I'm not sure what the purpose of this lambda function was. Anyway, I think it would only work as expected with data that have a 500 Hz sampling frequency. With a 1000 Hz frequency, every other sample would have its timestamp shifted by 1 ms, and this would create duplicate timestamps.
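
A tiny reproduction of that concern, assuming a 2 ms step applied to 1 ms-spaced (1000 Hz) timestamps:

# rounding each timestamp up to the next multiple of 2 ms makes the odd
# timestamps collide with the even ones, creating duplicates
shiftfun = lambda a, b: a + b - (a % b) if (a % b != 0) else a
print([shiftfun(t, 2) for t in [10, 11, 12, 13]])  # -> [10, 12, 12, 14]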

testing_path = data_path(download=False)
fname = op.join(testing_path, 'eyetrack', 'test_eyelink.asc')

raw = read_raw_eyelink(fname, interpolate_missing=True, annotate_missing=True)
Contributor

Were interpolate_missing=True and annotate_missing=True working for you? I think I may have broken something, because I get a circular import error when trying to import your functions from mne.preprocessing!
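
(If the cycle is between mne.preprocessing and the reader module, one common workaround is to defer the import into the function body; a sketch with hypothetical names, not the actual fix:)

def interpolate_missing(raw):
    # import lazily inside the function instead of at module top level,
    # which breaks the import cycle between the two modules
    from mne.io.eyetrack import some_helper  # hypothetical name
    ...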

@dominikwelke
Contributor Author

hi @scott-huberty et al.

sorry for being absent, I had conference travel and other commitments. I'll try to set aside some time to co-work on this feature in the future :)

@scott-huberty
Contributor

hi @scott-huberty et al.

sorry for being absent, I had conference travel and other commitments. I'll try to set aside some time to co-work on this feature in the future :)

No worries, I hope you enjoyed your travels!

scott-huberty added a commit to scott-huberty/mne-python that referenced this pull request Sep 20, 2022
…0855, adding eyetracking channel info to mne/io/constants, and handled the merge conflict it created with main [ci skip] [skip azp] [skip actions]
scott-huberty added a commit to scott-huberty/mne-python that referenced this pull request Sep 20, 2022
…ols#10855, which added eyetracking channel info to mne/channels/channels.py and mne/defaults.py, and mne/io/pick.py [ci skip] [skip azp]
@larsoner
Member

Closing for #11152

@larsoner larsoner closed this Mar 16, 2023
larsoner pushed a commit that referenced this pull request Mar 27, 2023
larsoner added a commit to cbrnr/mne-python that referenced this pull request Apr 21, 2023
* upstream/main: (50 commits)
  BUG: Fix bug with paths (mne-tools#11639)
  MAINT: Report download time and size (mne-tools#11635)
  MRG: Allow retrieval of channel names via make_1020_channel_selections() (mne-tools#11632)
  Fix index name in to_data_frame()'s docstring (mne-tools#11457)
  MAINT: Use VTK prerelease wheels in pre jobs (mne-tools#11629)
  ENH: Allow gradient compensated data in maxwell_filter (mne-tools#10554)
  make test compatible with future pandas (mne-tools#11625)
  Display SVG figures correctly in Report (mne-tools#11623)
  API: Port ieeg gui over to mne-gui-addons and add tfr gui example (mne-tools#11616)
  MAINT: Add token [ci skip] (mne-tools#11622)
  API: One cycle of backward compat (mne-tools#11621)
  MAINT: Use git rather than zipball (mne-tools#11620)
  ENH: Speed up code a bit (mne-tools#11614)
  [BUG, MRG] Don't modify info in place for transform points (mne-tools#11612)
  [BUG, MRG] Fix topomap extra plot generated, add util to check a range (mne-tools#11607)
  ENH: Add mne-bids-pipeline to mne sys_info (mne-tools#11606)
  MAINT: `coding: utf-8` is implicit in Python 3 (mne-tools#11599)
  ENH: Read eyetracking data (Eyelink) (Fork of mne-tools#10855 ) (mne-tools#11152)
  MAINT: In Python 3, do not prefix literals with `u` (mne-tools#11604)
  MAINT: object is an implicit base for all classes (mne-tools#11601)
  ...