A set of files for parsing EDF files generated by an SR Research EyeLink eye tracker and converted with their edf2asc program. This was largely a tutorial project for me and is unlikely to be of much use to others; however, I describe how one might use it in case anyone wants to learn from it or adapt it. In general, the options of edf2asc will do much of what I do programmatically here, and most languages used for data analysis (MATLAB, Python, R) would probably be a better choice for any further post-processing needed.
Haskell has many excellent parsing libraries to choose from, and most are documented in blog posts or introductory books on Haskell programming. However, I came across this post on using functions available in base Haskell and figured that it might be an easy way to get started. It was, but it was not a particularly easy way to get finished. Still, it did the trick, and for something like this (simple text, with only a few basic types of data to filter out and pack up) it will probably work for you too.
SR Research makes excellent eye trackers (as do others). One result of using their trackers is a binary file saved with the extension “.edf”. SR Research provides other software that works with that file directly, but they also provide a conversion program, edf2asc, to generate ASCII files from this binary version. Run without options, you get a largish file with individual lines that report time stamps and the locations of the tracked eyes. You also get various events, as the eye tracker does online classification of things like blinks, saccades, and fixations. You can find details about these codes in the manuals available from SR Research. Another option available when using one of these eye trackers is to send messages from a stimulus computer to the eye tracker over Ethernet; your messages are inserted, time stamped, into the EDF file so that you can tag events like when a trial started or when an experimental event occurred.
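To make those time-stamped MSG lines concrete, here is a minimal sketch, in the base-Haskell spirit of this project, of pulling them out of an ASCII-converted file. The field layout shown in the comment and the sample line in the usage note are assumptions for illustration; the exact layout of your files depends on your edf2asc options, so check your own output.

```haskell
-- Illustrative only: a MSG line in an edf2asc ASCII file looks roughly like
--   MSG <timestamp> <free text>
-- but the exact layout depends on edf2asc options.
import Data.Maybe (mapMaybe)

data Msg = Msg { msgTime :: Int, msgText :: String }
  deriving (Show, Eq)

-- Parse one line using only base functions (words and reads).
parseMsg :: String -> Maybe Msg
parseMsg line = case words line of
  ("MSG" : t : rest) -> case reads t of
    [(time, "")] -> Just (Msg time (unwords rest))
    _            -> Nothing
  _ -> Nothing

-- Collect every MSG event from a whole file's contents.
msgsIn :: String -> [Msg]
msgsIn = mapMaybe parseMsg . lines
```

Sample lines and the classification events (SFIX, ESACC, and so on) can be handled the same way, by matching on the leading keyword of each line; for example, `parseMsg "MSG 5467850 TRIAL_START 3"` yields a `Msg` with time 5467850.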
In my case the goal was a file suitable for import into R, where I planned to do my statistical analyses of the behavioral and eye tracking data, ideally in a reproducible format combining the analyses with the text of the manuscript reporting them. Those files will be posted elsewhere in the future, but this repository also holds the code used to parse the ASCII-converted data files.
After working a bit toward producing a CSV file from my parsing, I decided to move to a JSON format. I leave some of the files from the CSV attempt as bread crumbs for anyone else who wishes to try.
One of the things I like about Haskell is the types, and it made sense to me to think of a particular participant trial as a type; the JSON format allowed me to preserve that intuition when producing objects that were relatively easy to import into R. There are some helper functions and packages needed for doing that, and those are not included here. I only include the Haskell code used for going from an ASCII version of the raw data file (generated by edf2asc) to a JSON file containing the objects that I wanted for my purposes and useful for my protocols. With what is here it should be relatively straightforward to expand to other types of data events or different collections of edf2asc output events.
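As a sketch of the trial-as-a-type idea, here is a hypothetical trial record with a hand-rolled JSON encoder written with base functions only. The record's field names and shape are my own invention for illustration; the repository's TrialJSON module defines its own types.

```haskell
-- Hypothetical trial record; field names are illustrative, not the ones
-- this repository actually uses.
data Trial = Trial
  { subject  :: String    -- participant identifier
  , trialNum :: Int       -- trial index within the session
  , fixX     :: [Double]  -- fixation x positions, pixels
  , fixY     :: [Double]  -- fixation y positions, pixels
  } deriving Show

-- Encode one trial as a JSON object. show on a Haskell Int, a list of
-- Doubles, or a simple String happens to produce valid JSON text.
toJson :: Trial -> String
toJson t = concat
  [ "{\"subject\":",  show (subject t)
  , ",\"trialNum\":", show (trialNum t)
  , ",\"fixX\":",     show (fixX t)
  , ",\"fixY\":",     show (fixY t)
  , "}"
  ]
```

On the R side, objects like these can then be read with something like jsonlite::fromJSON and flattened into a data frame, one row per trial.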
Don’t use the executable. The idea was that I would run this from the command line on a directory of eligible files, but by the time I was done debugging and testing, I had pretty much already converted everything from a repl. The Main.hs here does show a beginning of how this could work.
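For anyone who does want the batch behavior the executable was meant to have, a minimal sketch might look like the following. The converter is passed in as an argument so the sketch compiles on its own; in this repo it would be the library's edfViaAsc2Json. Only base plus the directory and filepath packages are used.

```haskell
-- Sketch of what Main.hs could grow into: run a conversion over every
-- .asc file in a directory.
import System.Directory (listDirectory)       -- directory package
import System.FilePath ((</>), takeExtension) -- filepath package

-- Pure selection of the eligible files, kept separate so it is testable.
ascFilesIn :: FilePath -> [FilePath] -> [FilePath]
ascFilesIn dir entries =
  [ dir </> e | e <- entries, takeExtension e == ".asc" ]

-- Apply the converter to every .asc file found in dir.
convertAll :: (FilePath -> IO ()) -> FilePath -> IO ()
convertAll convert dir = do
  entries <- listDirectory dir
  mapM_ convert (ascFilesIn dir entries)
```

Passing the converter as a function also makes it easy to dry-run the traversal with putStrLn before pointing it at real data.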
I use cabal new-style commands. To compile the library and executable, simply clone the repo, move into the top directory, and run cabal new-build. This presumes, of course, that you have the necessary versions of Haskell installed. If you don’t, or that seems puzzling, don’t bother playing with this repo; you have bigger fish to fry. I use Linux and have only used this on computers running Linux.
I have included a small and a large data file for testing and playing in the data subdirectory. Run cabal new-repl lib:parse and then, in the repl, run :m +TrialJSON. Now you have all the exported functions for parsing a made-up line or an entire file available in your repl for playing and testing. If you want to see whether it all works, you can delete the files in the parseData subdirectory (but not the subdirectory itself) and run:
edfViaAsc2Json "./data/12exp2_7ed5d76a-9038-45fb-9b44-e7236b04805c_.asc"
This should generate an intermediate parsed data file and then convert it to a JSON file that can be imported into R.
Please see the excellent blog post mentioned above. It has the basic information you need, and your sweat, blood, and tears will provide the necessary lubrication to get things running.