basic eyetracker functionality #10855
Conversation
- create new raw class for et data recorded with SR Research's EyeLink system - can read eyelink .asc files (using djangraw's ParseEyeLinkAsc function - https://github.com/djangraw/ParseEyeLinkAscFiles/blob/master/ParseEyeLinkAsc.py) - tbd: adding annotations, create info from file header
- fill missing data in epoched recording with nan - work on parsing errors: exclude rows that have negative time values
- add to mne.io.__init__.py - add eyetrack as option to create_info() docstring
@dominikwelke FYI I've just tested the current state of the PR with an environment that uses scipy==1.8.0 and it worked fine. So I think we can chalk up the previous failure to "something went wrong with your environment" and not worry about what exactly it was. Also, here is what I see when plotting (screenshot): I think we need to work on scalings next, so that the traces are a bit more intelligible. See lines 28 to 42 of `mne/io/constants.py` in e49525c.
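For context, a sketch of how custom scalings could be passed when plotting. The channel-type keys and file path here are hypothetical placeholders; the final names weren't settled at this point in the PR:

```python
from mne.io import read_raw_eyelink

raw = read_raw_eyelink('test_eyelink.asc')  # hypothetical file
# hypothetical scalings: gaze traces in pixels, pupil in arbitrary units
raw.plot(scalings=dict(eyegaze=1e2, pupil=1e3))
```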
sounds good! x-y coordinates are in screen pixels (in my case), but they might also come as cm or degrees of visual angle with other systems.. pupil area is in arbitrary values. not sure what other manufacturers put.
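For reference, the usual pixel-to-degrees conversion when the screen geometry is known (all numbers below are hypothetical):

```python
import numpy as np

screen_width_cm = 53.0        # hypothetical monitor width
screen_width_px = 1920        # hypothetical horizontal resolution
viewing_distance_cm = 70.0    # hypothetical eye-to-screen distance

px_offset = 240  # gaze position, in pixels from the screen centre
offset_cm = px_offset * screen_width_cm / screen_width_px
visual_angle_deg = np.degrees(2 * np.arctan(offset_cm /
                                            (2 * viewing_distance_cm)))
print(round(visual_angle_deg, 2))  # ~5.42 degrees
```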
Once mne-tools/fiff-constants#39 is merged, then here: `mne/io/constants.py`, lines 182 to 226 in 386d4cf.
Don't wait for fiff-constants. Change the lines at the top of `mne/io/constants.py`.
like this, @larsoner?
Pull the tiny commit I just made then locally do:
and see if it passes -- let us know.
force-pushed from 111f037 to 54e3abb
nope, doesn't pass.. here's the error log:
No worries @dominikwelke , I knew that this was a feature in development, so I had no unrealistic expectations for it to work for me "out of the box" : ) I hit a couple of small errors that were pretty easy to fix. It looks like the test data you used was in binocular mode; our test data was collected using monocular mode + remote mode. The .asc files will look slightly different depending on what mode was used, so it will be good to test the code with these different types of data, so we can catch the breakpoints and refactor the code where needed! I'll create my own branch based on this branch, and give you more details on what I've changed once it's a bit more organized!
Hi, I'm also very interested in this development. I saw that you created a new type of object called `RawEyelink`. Is there a reason not to use a generic eyetracking class that could also cover other systems (e.g., Tobii)?
Well in theory (based on how the other raw classes work), these classes aren't much more than differently labelled shells around the data loaded by the helper functions, so manufacturer-specific classes cost us very little.
Agreed. We could also differentiate the reader functions by revising the API.

@sportnoah14 , thanks for reaching out! Question: are you collecting Tobii eyetracking data simultaneously with EEG or MEG? IMO part of the motivation for building this feature for Eyelink systems was to be able to co-register the eyetracking x/y gaze traces with EEG/MEG traces (SR Research Eyelink systems can be integrated with EEG or MEG systems, which some of us are using). Anyways, I'm hoping we can have this branch more developed by the end of September. Maybe it makes more sense to start thinking about adding a reader for the Tobii system at that point, when we'll hopefully have a more solid template for the eyetracking class.
yes @sportnoah14 , i simply followed the MNE design for other data types to have manufacturer-specific raw classes. as @larsoner already said, these classes aren't much more than differently labelled shells to package array data loaded with helper functions, in this case the .asc parsing helper. ..but actually, i shared your intuition and had even started the codebase using a generic class.. you can still see it in the well maintained docstring :D
yes, @scott-huberty - shifting code around might make sense as soon as things are a bit more settled and functionality grows. for the API it doesn't matter, as users don't need to know where the code sits and already import the reader function as `mne.io.read_raw_eyelink`. so tl;dr - happy to reorganize code when it makes sense, but right now i think we can focus on the functionality :)
@sportnoah14 - i think if you were motivated there's no need to wait, and you could already start writing a reader function for your files. the function would need to extract:

- the channel data (e.g. x/y gaze position, pupil size)
- the sampling frequency
- channel names and types
these things are the minimum we need to generate a raw object from the data :) (see the sketch below)
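As an illustration of that minimum, a sketch under assumptions: the arrays are hypothetical, and a generic `misc` channel type stands in for the dedicated eyetracking channel types still being added in this PR:

```python
import numpy as np
import mne

# hypothetical arrays extracted from a Tobii/PsychoPy recording
data = np.random.rand(3, 6000)  # x gaze, y gaze, pupil size
sfreq = 120.  # hypothetical sampling frequency (Hz)

info = mne.create_info(ch_names=['xpos', 'ypos', 'pupil'],
                       sfreq=sfreq, ch_types='misc')
raw = mne.io.RawArray(data, info)
```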
@dominikwelke I could start writing a basic io function. Though I don't save tobii data in its native format (if it has one), but rather save it through the Matlab or Python (Psychopy) SDK. So I can start by creating a class to handle data from tobii. Many labs that run psychophysics experiments do this with tobii. Maybe one of the most useful things to do would be to start creating an io function for eyetracking data saved via Psychopy, as it saves the data in a standardized file format and handles many different types of eyetrackers. I also think having a generic class would be great for eyetracking, and everything could be built from that, as there are many types and companies for eyetracking and some record additional things besides xy position on screen and pupil size.

@scott-huberty I'm recording eyetracking simultaneously with eeg data. However, it isn't recorded on the same acquisition system, so they have to be synced by TTL pulses or some other method first.

@larsoner, question for you. Eyetrackers sometimes encounter an error and may not record data for a few seconds, leading to unevenly sampled data. What's the best way to handle unevenly sampled data? I.e. data in which it's better to use timestamps rather than a start time and sampling rate (which is assumed to be consistent during the whole recording). I've been looking into eyetracking functions, and this package may be a good reference for types of functions to incorporate and how to write them: https://github.com/titoghose/PyTrack .
We have no capability to handle unevenly sampled data in our base classes, and adding it would probably be a pain. I would use some simple resampling (e.g., linear interp) to resample to an even sample rate. |
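A minimal sketch of that approach with `numpy`, assuming hypothetical timestamp/position arrays containing a recording gap:

```python
import numpy as np

sfreq = 500.  # hypothetical target sampling rate (Hz)

# hypothetical uneven timestamps (s) with a dropout between 0.004 and 0.050
t = np.array([0.000, 0.002, 0.004, 0.050, 0.052, 0.054])
x = np.array([512., 514., 513., 600., 601., 599.])  # gaze x position (px)

t_even = np.arange(t[0], t[-1], 1. / sfreq)  # uniform time grid
x_even = np.interp(t_even, t, x)  # linear interpolation onto the grid
```

One could additionally mark the interpolated stretch with NaNs or an annotation afterwards, so the dropout stays visible downstream.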
```python
if 'datetime_str' in locals():
    meas_date = dt.datetime.strptime(datetime_str,
                                     '%a %b %d %H:%M:%S %Y')
    meas_date = meas_date.replace(tzinfo=dt.timezone.utc)
```
@dominikwelke I'm not sure that this is correct. The date/time string in the .ASC file header appears to be naive (no timezone info). Using `replace(tzinfo=...)` just forces the timezone to 'UTC'. This would only be correct if that naive date/time string is already expressed in UTC, and not a local timezone. We should check with Eyelink (I have been in touch with them and can send an email).

My other concern is that this date/time string generally won't accurately report the date/time of the session. This date/time string is taken from the Host PC (which runs on the homebrewed QNX OS), which is never connected to the internet. At least in our lab here, the date/time reported in the .asc files is very inaccurate. Is that the case for the .asc files in your lab?
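A small sketch of the relabel-vs-convert distinction raised above, with a hypothetical header string:

```python
import datetime as dt

naive = dt.datetime.strptime('Wed Mar 2 09:04:18 2022',
                             '%a %b %d %H:%M:%S %Y')

# replace() only relabels the naive stamp as UTC; the clock time is unchanged
forced_utc = naive.replace(tzinfo=dt.timezone.utc)

# astimezone() with no argument assumes the *local* timezone, then converts
converted_utc = naive.astimezone().astimezone(dt.timezone.utc)

print(forced_utc, converted_utc)  # differ by the local UTC offset
```

The two results agree only if the machine that wrote the header happened to be on UTC, which is exactly the ambiguity being discussed.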
> My other concern is that this date/time string generally won't accurately report the date/time of the session.

yes, timezones are a mess in my setup too.
it is the only thing available, though, isn't it?
i think this software should extract the data that's in the header. necessary changes can then be done manually by the users.
alternatively, we'd have to pass additional information as arguments to the reader function, and i wouldn't want that..

> The date/time string in the .ASC file header appears to be naive (no timezone info). Using `replace(tzinfo=...)` just forces the timezone to 'UTC'. This would only be correct if that naive date/time string is already expressed in UTC, and not a local timezone.

you caught me ;)
this was quick and dirty coding to get a working timezoned stamp from the header.

- i wasn't sure we even have the necessary info for setting the accurate timezone in the header ('where' the dataset was recorded) - yes, asking SR what timezone the stamp in the header is stored in makes sense
- given the general level of doubt in the timestamp (see above), i didn't care too much in the first place.

when aligning the data with other recordings (e.g. EEG) we need better data anyway (e.g. shared events), and then the most trusted timestamp can be used for the merged recording.
and if the dataset stands for itself, i tend to think super accurate times are not important - the relative time of recordings is reliable (session 1 was X hours before session 2), and when you want to publish the dataset, some anonymization algorithms (like the one in mne-bids) would change the timestamps anyway.

of course this is only my opinion, happy to make the timestamp extraction better if there is a good way :)
I agree that we don't want to ask users to specify a timezone via an argument parameter - too many opportunities for mistakes. But if we want to set the `meas_date`, we'll have to specify a time zone or `mne` will return an error. IMO not setting a `meas_date` is preferable to setting an incorrect one. Anyway, I think the `meas_date` here is primarily metadata; I don't think we will need it for co-registration with EEG/MEG. We will find out when we start working on that!
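A minimal illustration of that fallback, using a throwaway `RawArray` as a stand-in for the object the reader would return:

```python
import numpy as np
import mne

# stand-in Raw object; in practice this would come from the eyelink reader
info = mne.create_info(['xpos'], sfreq=500., ch_types='misc')
raw = mne.io.RawArray(np.zeros((1, 500)), info)

raw.set_meas_date(None)  # leave meas_date unset rather than forcing UTC
print(raw.info['meas_date'])  # -> None
```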
If you send me your email on discord, I can put you in CC in my emails with SR. I'll ask them if there is a good way to discern the timezone in the .asc files.
```python
                          pupil=pupil,
                          read_eye_events=read_eye_events)
elif ftype == 'edf':
    raise NotImplementedError('Eyelink .edf files not supported, yet')
```
I removed this line - IMO I don't think support for reading eyelink data files (.edf) directly is feasible. This would require the user to have the Eyelink SDK properly installed on their machine, and the MNE-Python codebase would need wrapper functions to execute the eyelink SDK binary files (and these functions would need to infer the user machine's OS and act accordingly). My understanding is that this is something that can be very difficult to maintain; it is probably simpler for users to convert their files to .asc format with the tools eyelink provides. But feel free to ask the core devs if you disagree.
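For reference, that conversion step can be scripted; a sketch assuming SR Research's `edf2asc` converter is installed and on the PATH, with a hypothetical file name:

```python
import subprocess

# convert recording.edf to recording.asc with SR Research's command-line tool
subprocess.run(['edf2asc', 'recording.edf'], check=True)
```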
At some point pyeparse supported EDF I think with this code:
https://github.com/pyeparse/pyeparse/tree/master/pyeparse/edf
If it still works we could try using it. But it can be in a follow-up PR since it might indeed be a pain, and adding ASCII support is already a good first step.
@larsoner Oh okay maybe I was mistaken then, sounds good!
@scott-huberty - i inserted the `ftype` switch for the future (not this PR). as @larsoner mentioned, there is the `pyeparse` codebase that can in general read eyelink edf files (though when i tested it, it seemed - if at all - to only work with monocular recordings). it requires users to install the eyelink sdk, but i don't see this necessarily as a problem. converting to asc also requires downloading extra software from the SR research forum.
My concern wasn't having the user use other software (like eyelink's edf2asc converter) so much as having to maintain a wrapper function in `mne` that accesses eyelink's API. But if the `pyeparse` code is already there, maybe it's not as bad as I thought!

Weird that it only works with monocular recordings. I have some monocular recordings I can test out with it.
```python
# transpose to correct sfreq
# samples = df_samples['tSample'].apply(lambda x: x * sfreq / 1000.)
# first_sample = samples.min()
shiftfun = lambda a, b: a + b - (a % b) if (a % b != 0) else a
```
I'm not sure what the purpose of this lambda function was. Anyways, I think it would only work as expected with data that have a 500 Hz sampling frequency. With a 1000 Hz frequency, every other sample would have its timestamp shifted by 1 ms, and this would create duplicate timestamps.
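For what it's worth, the lambda rounds `a` up to the next multiple of `b`, which is where the duplicates come from (hypothetical integer-millisecond timestamps below):

```python
shiftfun = lambda a, b: a + b - (a % b) if (a % b != 0) else a

# with a 2 ms step, odd timestamps snap onto the following even ones
print([shiftfun(t, 2) for t in (0, 1, 2, 3, 4)])  # [0, 2, 2, 4, 4]
```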
```python
testing_path = data_path(download=False)
fname = op.join(testing_path, 'eyetrack', 'test_eyelink.asc')

raw = read_raw_eyelink(fname, interpolate_missing=True,
                       annotate_missing=True)
```
was `interpolate_missing=True` and `annotate_missing=True` working for you? I think I may have broken something, because I get a circular import error when trying to import your functions from `mne.preprocessing`!
hi @scott-huberty et al. sorry for being absent, i had conference travel and other commitments. i'll try to set aside some time to cowork on this feature in the future :)

No worries, I hope you enjoyed your travels!
…0855, adding eyetracking channel info to mne/io/constants, and handled the merge conflict it created with main [ci skip] [skip azp] [skip actions]
…ols#10855, which added eyetracking channel info to mne/channels/channels.py and mne/defaults.py, and mne/io/pick.py [ci skip] [skip azp]
force-pushed from c6007b6 to c1470b7
Closing for #11152
Co-authored-by: dominikwelke <[email protected]>
* upstream/main: (50 commits)
  BUG: Fix bug with paths (mne-tools#11639)
  MAINT: Report download time and size (mne-tools#11635)
  MRG: Allow retrieval of channel names via make_1020_channel_selections() (mne-tools#11632)
  Fix index name in to_data_frame()'s docstring (mne-tools#11457)
  MAINT: Use VTK prerelease wheels in pre jobs (mne-tools#11629)
  ENH: Allow gradient compensated data in maxwell_filter (mne-tools#10554)
  make test compatible with future pandas (mne-tools#11625)
  Display SVG figures correctly in Report (mne-tools#11623)
  API: Port ieeg gui over to mne-gui-addons and add tfr gui example (mne-tools#11616)
  MAINT: Add token [ci skip] (mne-tools#11622)
  API: One cycle of backward compat (mne-tools#11621)
  MAINT: Use git rather than zipball (mne-tools#11620)
  ENH: Speed up code a bit (mne-tools#11614)
  [BUG, MRG] Don't modify info in place for transform points (mne-tools#11612)
  [BUG, MRG] Fix topomap extra plot generated, add util to check a range (mne-tools#11607)
  ENH: Add mne-bids-pipeline to mne sys_info (mne-tools#11606)
  MAINT: `coding: utf-8` is implicit in Python 3 (mne-tools#11599)
  ENH: Read eyetracking data (Eyelink) (Fork of mne-tools#10855 ) (mne-tools#11152)
  MAINT: In Python 3, do not prefix literals with `u` (mne-tools#11604)
  MAINT: object is an implicit base for all classes (mne-tools#11601)
  ...
new raw class for eyetracking data
with reader functions for various data formats
alignment and merge functionality with other MNE objects planned
dev and testing using data recorded with SR Research EyeLink 1000+
closes #10751
see also mne-tools/fiff-constants#39