imhr.eyetracking._eyelink

@purpose: Interface for the SR Research Eyelink eyetracking system.
@date: Created on Wed Feb 13 15:37:43 2019
@author: Semeon Risom
@email: semeon.risom@gmail.com
@url: https://semeon.io/d/imhr

Classes

Eyelink(window, timer[, isPsychopy, subject]) Interface for the SR Research Eyelink eyetracking system.
class imhr.eyetracking._eyelink.Eyelink(window, timer, isPsychopy=True, subject=None, **kwargs)

Bases: object

Interface for the SR Research Eyelink eyetracking system.
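
Examples

A minimal instantiation sketch, assuming a PsychoPy window and clock have already been created; the window settings and subject number below are illustrative, not required values:

>>> from psychopy import visual, core
>>> import imhr
>>> window = visual.Window(size=[1920, 1080], fullscr=True, units='pix')
>>> timer = core.Clock()
>>> eyetracking = imhr.eyetracking.Eyelink(window=window, timer=timer, isPsychopy=True, subject=1)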

Methods

calibration(self) Start calibration procedure.
connect(self[, calibration_type, …]) Connect to Eyelink.
drift_correction(self[, origin]) Starts drift correction.
finish_recording(self[, path]) Ends Eyelink recording.
gc(self, bound, min_[, max_]) Creates gaze-contingent event.
sample(self) Collects new gaze coordinates from Eyelink.
send_message(self, msg) Send message to Eyelink.
send_variable(self, variables) Send trial variables to Eyelink at the end of a trial.
set_eye_used(self, eye) Set dominant eye.
start_recording(self, trial, block) Starts recording of Eyelink.
stop_recording(self[, trial, block, variables]) Stops recording of Eyelink.
connect(self, calibration_type=13, automatic_calibration_pacing=1000, saccade_velocity_threshold=35, saccade_acceleration_threshold=9500, sound=True, select_parser_configuration=0, recording_parse_type='GAZE', enable_search_limits=True, ip='100.1.1.1')

Connect to Eyelink.

Parameters:
ip : str

Host PC IP address.

calibration_type : int

Calibration type. Default is 13-point. [see Eyelink 1000 Plus User Manual, 3.7 Calibration]

automatic_calibration_pacing : int

Sets the delay, in milliseconds, between successive calibration or validation targets when automatic target detection is active. [see pylink.chm]

saccade_velocity_threshold : int

Sets velocity threshold of saccade detector: usually 30 for cognitive research, 22 for pursuit and neurological work. Default is 35. Note: For EyeLink II and EyeLink 1000, select_parser_configuration should be used instead. [see EyeLink Programmer’s Guide, Section 25.9: Parser Configuration; Eyelink 1000 Plus User Manual, Section 4.3.5 Saccadic Thresholds]

saccade_acceleration_threshold : int

Sets acceleration threshold of saccade detector: usually 9500 for cognitive research, 5000 for pursuit and neurological work. Default is 9500. Note: For EyeLink II and EyeLink 1000, select_parser_configuration should be used instead. [see EyeLink Programmer’s Guide, Section 25.9: Parser Configuration; Eyelink 1000 Plus User Manual, Section 4.3.5 Saccadic Thresholds]

select_parser_configuration : int

Selects the standard parser setup (0) or the more sensitive setup (1). These are equivalent to the cognitive and psychophysical configurations, respectively. Default is 0. [see EyeLink Programmer’s Guide, Section 25.9: Parser Configuration]

sound : bool

Whether sound should be used during calibration, validation, and drift correction.

recording_parse_type : str

Sets how velocity information for saccade detection is to be computed. Enter either ‘GAZE’ or ‘HREF’. Default is ‘GAZE’. [see Eyelink 1000 Plus User Manual, Section 4.4: File Data Types]

enable_search_limits : bool

Enables global search limits for pupil tracking. Default is True. [see Eyelink 1000 Plus User Manual, Section 4.4: File Data Types]

Returns:
param : pandas.DataFrame

Returns a dataframe of parameters for the subject.

Examples

>>> param = eyetracking.connect(calibration_type=13)
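A fuller call, passing the documented defaults explicitly; the Host PC IP shown is the documented default and may differ in your lab:

>>> param = eyetracking.connect(
...     calibration_type=13,
...     automatic_calibration_pacing=1000,
...     saccade_velocity_threshold=35,
...     saccade_acceleration_threshold=9500,
...     select_parser_configuration=0,
...     recording_parse_type='GAZE',
...     enable_search_limits=True,
...     sound=True,
...     ip='100.1.1.1')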
set_eye_used(self, eye)

Set dominant eye. This step is required for receiving gaze coordinates from the Eyelink into PsychoPy.

Parameters:
eye : str

Dominant eye ('left' or 'right'). This will be used when outputting Eyelink gaze samples.

Examples

>>> dominant_eye = 'left'
>>> eye_used = eyetracking.set_eye_used(eye=dominant_eye)
calibration(self)

Start calibration procedure.

Returns:
isCalibration : bool

Boolean indicating the status of calibration.

Examples

>>> eyetracking.calibration()
drift_correction(self, origin='call')

Starts drift correction. This can be done at any point after calibration, including before or after eyetracking.start_recording has been initiated.

Parameters:
origin : str

Origin of the call, either manual (default) or from a gaze-contingent event (gc).

Returns:
isDriftCorrection : bool

Boolean indicating the status of drift correction.

Notes

To function properly, running drift_correction will end any active start_recording event. Once drift correction has occurred, it is safe to run start_recording again.

Examples

>>> eyetracking.drift_correction()
start_recording(self, trial, block)

Starts recording of Eyelink.

Parameters:
trial : str

Trial Number.

block : str

Block Number.

Returns:
isRecording : bool

Boolean indicating the status of Eyelink recording.

Notes

tracker.beginRealTimeMode():
Ensures that no data is missed before the important part of the trial starts. The EyeLink tracker requires 10 to 30 milliseconds after the recording command to begin writing data. This extra data also allows the detection of blinks or saccades just before the trial start, so that bad trials can be discarded in saccadic RT analysis. A “SYNCTIME” message later in the trial marks the actual zero-time in the trial’s data record for analysis. [see pylink.chm]
TRIALID:
The “TRIALID” message is sent to the EDF file next. This message must be placed in the EDF file before the drift correction and before recording begins, and is critical for data analysis. The viewer will not parse any messages, events, or samples that exist in the data file prior to this message. The command identifier can be changed in the data loading preference settings. [see Data Viewer User Manual, Section 7: Protocol for EyeLink Data to Viewer Integration]
SYNCTIME:
Marks the zero-time in a trial. A number may follow, which is interpreted as the delay of the message from the actual stimulus onset. It is suggested that recording start 100 milliseconds before the display is drawn or unblanked at zero-time, so that no data at the trial start is lost. [see pylink.chm]

Examples

>>> eyetracking.start_recording(trial=1, block=1)
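
Per the notes above, a “SYNCTIME” message can be sent at actual stimulus onset using send_message; this is a sketch, and whether start_recording already sends one internally is not documented here:

>>> eyetracking.start_recording(trial=1, block=1)
>>> # ...draw or unblank the stimulus at zero-time...
>>> eyetracking.send_message(msg="SYNCTIME")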
gc(self, bound, min_, max_=None)

Creates a gaze-contingent event. This function needs to be run while recording.

Parameters:
bound : dict [str, int]

Dictionary of the bounding box for each region of interest. Keys are each side of the bounding box and values are their corresponding coordinates in pixels.

min_ : int

Minimum duration (msec) for which the gaze-contingent capture collects data before allowing the task to continue.

max_ : int or None

Maximum duration (msec) before the task is forced into drift correction.

Examples

>>> # Collect samples within the center of the screen, for 2000 msec, 
>>> # with a max limit of 10000 msec.
>>> bound = dict(left=860, top=440, right=1060, bottom=640)
>>> eyetracking.gc(bound=bound, min_=2000, max_=10000)
sample(self)

Collects new gaze coordinates from Eyelink.

Returns:
gxy : tuple

Gaze coordinates.

ps : tuple

Pupil size (area).

sample : EyeLink.getNewestSample

Newest Eyelink sample.

Examples

>>> gxy, ps, sample = eyetracking.sample()
send_message(self, msg)

Send message to Eyelink. This allows post-hoc processing of event markers (e.g., “stimulus onset”).

Parameters:
msg : str

Message to be received by Eyelink.

Examples

>>> eyetracking.send_message(msg="stimulus onset")
send_variable(self, variables)

Send trial variables to Eyelink at the end of a trial.

Parameters:
variables : dict or None

Trial-related variables to be read by Eyelink.
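
Examples

A minimal sketch; the variable names and values are illustrative:

>>> variables = dict(stimulus='face.png', event='stimulus')
>>> eyetracking.send_variable(variables=variables)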

stop_recording(self, trial=None, block=None, variables=None)

Stops recording of Eyelink. Also allows transmission of trial-level variables to Eyelink.

Parameters:
trial : int

Trial Number.

block : int

Block Number.

variables : dict or None

Dict of variables to send to Eyelink (variable name, value).

Returns:
isRecording : bool

Boolean indicating the status of Eyelink recording.

Notes

pylink.pumpDelay():
Does an unblocked delay using currentTime(). This is the preferred delay function when accurate timing is not needed. [see pylink.chm]
pylink.msecDelay():
During calls to pylink.msecDelay(), Windows is not able to handle messages. One result of this is that windows may not appear. This is the preferred delay function when accurate timing is needed. [see pylink.chm]
tracker.endRealTimeMode():
Returns the application to a priority slightly above normal, to end realtime mode. This function should execute rapidly, but there is the possibility that Windows will allow other tasks to run after this call, causing delays of 1-20 milliseconds. This function is equivalent to the C API void end_realtime_mode(void). [see pylink.chm]
TRIAL_VAR:
Lets users specify a trial variable and value for the given trial. One message should be sent for each trial condition variable and its corresponding value. If this command is used there is no need to use TRIAL_VAR_LABELS. The default command identifier can be changed in the data loading preference settings. Please note that the eye tracker can handle about 20 messages every 10 milliseconds, so be careful not to send too many messages too quickly if you have many trial condition messages to send. Add a one-millisecond delay between message lines if this is the case. [see pylink.chm]
TRIAL_RESULT:
Defines the end of a trial. The viewer will not parse any messages, events, or samples that exist in the data file after this message. The command identifier can be changed in the data loading preference settings. [see Data Viewer User Manual, Section 7: Protocol for EyeLink Data to Viewer Integration]

Examples

>>> variables = dict(stimulus='face.png', event='stimulus')
>>> eyetracking.stop_recording(trial=trial, block=block, variables=variables)
finish_recording(self, path=None)

Ends Eyelink recording.

Parameters:
path : str or None

Path to save data. If None, the default path from the PsychoPy task is used.

Returns:
isFinished : bool

Boolean indicating the status of the Eyelink recording.

Notes

pylink.pumpDelay():
Does an unblocked delay using currentTime(). This is the preferred delay function when accurate timing is not needed. [see pylink.chm]
pylink.msecDelay():
During calls to pylink.msecDelay(), Windows is not able to handle messages. One result of this is that windows may not appear. This is the preferred delay function when accurate timing is needed. [see pylink.chm]
tracker.setOfflineMode():
Places the EyeLink tracker in offline (idle) mode and waits until the tracker has finished the mode transition. [see pylink.chm]
tracker.endRealTimeMode():
Returns the application to a priority slightly above normal, to end realtime mode. This function should execute rapidly, but there is the possibility that Windows will allow other tasks to run after this call, causing delays of 1-20 milliseconds. This function is equivalent to the C API void end_realtime_mode(void). [see pylink.chm]
tracker.receiveDataFile():
This receives a data file from the EyeLink tracker PC. Source filename and destination filename should be given. [see pylink.chm]

Examples

>>> # End recording session
>>> eyetracking.finish_recording()
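
Putting the methods above together, a rough single-trial sketch using the values from the earlier examples (the trial, block, bounding box, and variables are illustrative):

>>> param = eyetracking.connect(calibration_type=13)
>>> eye_used = eyetracking.set_eye_used(eye='left')
>>> eyetracking.calibration()
>>> eyetracking.start_recording(trial=1, block=1)
>>> gxy, ps, sample = eyetracking.sample()
>>> eyetracking.send_message(msg="stimulus onset")
>>> variables = dict(stimulus='face.png', event='stimulus')
>>> eyetracking.stop_recording(trial=1, block=1, variables=variables)
>>> eyetracking.finish_recording()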