ZeroMQ stream class #34
Conversation
pluma/examples/load_zeromq.ipynb
Outdated
Can we maybe move this into emotional-cities/notebooks? It would be best to keep this repo without references to specific datasets.
Yes, good point. I'll remove it from this PR and shift it into notebooks.
pluma/schema/outdoor.py
Outdated
# Pupil streams
streams.PupilLabs.Counter.DecodedFrames = HarpStream(209, device='PupilLabs', streamlabel='Counter_DecodedFrames', root=root, autoload=autoload, parent_dataset=parent_dataset)
streams.PupilLabs.Counter.RawFrames = HarpStream(210, device='PupilLabs', streamlabel='Counter_RawFrames', root=root, autoload=autoload, parent_dataset=parent_dataset)
streams.PupilLabs.Data.RawFrames = PupilStream([('Format', np.uint32), ('Width', np.uint32), ('Height', np.uint32), ('Sequence', np.uint32), ('Timestamp', np.uint64), ('DataBytes', np.uint32), ('Reserved', np.uint32)],
For consistency, we should avoid having any details of the file formats coming in at the level of the schema, so perhaps we would need to have even more specialized classes for the different types, e.g. PupilRawFrameStream, PupilGazeStream, GliaEyeTrackingStream, GliaHeartRateStream, etc.
Ideally most of the things in the schema would describe which devices were used for which experiment and potentially where they come from, e.g. the folder name or register number for Harp, etc., but without having to go as far as specifying data types and labels for each output column.
For Harp streams and Empatica streams this was done by hard-coding the format directly into the IO file readers. In this case, because we have generic classes in the backend, it would be easier to just derive classes from them but keep the overall approach.
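As a rough sketch of that idea (assuming PupilStream takes the column definition as its first argument, as in the schema snippet above, and that the import path shown is where the class lives; neither is confirmed by this PR):

```python
import numpy as np

# Assumed import path; the actual module layout in pluma may differ.
from pluma.stream.zeromq import PupilStream


class PupilRawFrameStream(PupilStream):
    """PupilStream with the NDSI raw-frame header format baked in, so the
    schema only states which device/stream this is, not how to decode it."""

    # Column layout copied from the outdoor schema snippet above.
    _FRAME_DTYPE = [
        ('Format', np.uint32), ('Width', np.uint32), ('Height', np.uint32),
        ('Sequence', np.uint32), ('Timestamp', np.uint64),
        ('DataBytes', np.uint32), ('Reserved', np.uint32),
    ]

    def __init__(self, **kwargs):
        # Forward the hard-coded format so schemas never repeat it.
        super().__init__(self._FRAME_DTYPE, **kwargs)
```

The schema entry would then shrink to something like `streams.PupilLabs.Data.RawFrames = PupilRawFrameStream(device='PupilLabs', streamlabel='Data_RawFrames', root=root, autoload=autoload, parent_dataset=parent_dataset)`, keeping the format details in the IO layer as is already done for the Harp and Empatica readers.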
Summary
This PR addresses issues #9 and #10 and introduces a ZeroMQ stream reader that can also be used to read Pupil Invisible and Glia data streams.
In general, in pluma experiments NetMQ messages are logged as separate binary files, one for each frame within the message. To decode these binary files, the ZeromqStream class is given the names of those binary files and a data type definition with which to unpack the flat binary.
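As a rough illustration of the decoding this wraps (not the actual pluma API; the file name and field layout below are placeholders invented for the example), each frame file is a flat run of fixed-size records that numpy can unpack against a structured dtype:

```python
import numpy as np

# Placeholder field layout; in practice this is whatever the schema supplies.
frame_dtype = np.dtype([
    ('Sequence', np.uint32),
    ('Timestamp', np.uint64),
])

# Each logged NetMQ frame file is read back as an array of these records.
# 'Example_Frame1.bin' is a placeholder file name.
records = np.fromfile('Example_Frame1.bin', dtype=frame_dtype)
print(records['Timestamp'][:5])
```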
Pupil-specific
Pupil data is received and logged in Bonsai via NetMQ messages, each with 3 NetMQ frames according to the NDSI spec: https://github.com/pupil-labs/pyndsi/blob/v1.0/ndsi-commspec.md
The PupilStream class, introduced as a subclass of ZeromqStream, extracts data according to this specification, with the frames saved as individual binary streams (e.g. Gaze_Frame0.bin, Gaze_Frame1.bin, Gaze_Frame2.bin).
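For example, a minimal sketch of reading one such set of frame files with plain numpy (purely illustrative: the file names are placeholders following the pattern above, the header layout is the raw-frame one from the outdoor schema snippet, and which frame holds the header versus the payload is an assumption, not something specified in this PR):

```python
import numpy as np

# Fixed-size header layout taken from the outdoor schema snippet (raw frames).
header_dtype = np.dtype([
    ('Format', np.uint32), ('Width', np.uint32), ('Height', np.uint32),
    ('Sequence', np.uint32), ('Timestamp', np.uint64),
    ('DataBytes', np.uint32), ('Reserved', np.uint32),
])

# One file per NDSI frame, as logged by Bonsai in pluma experiments.
with open('Example_Frame0.bin', 'rb') as f:
    frame0 = f.read()                                            # e.g. topic / sensor id bytes
headers = np.fromfile('Example_Frame1.bin', dtype=header_dtype)  # assumed header frame
payload = np.fromfile('Example_Frame2.bin', dtype=np.uint8)      # assumed payload frame

print(len(headers), 'header records,', payload.size, 'payload bytes')
```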
Schemas / Examples
This PR also modifies the vr and outdoor schemas to reflect the new stream classes. An example notebook demonstrating the use of the new stream types is included as well.