ETracker

ETracker(win, etracker_id=0, simulate=False, verbose=True)

A high-level controller for running eye-tracking experiments with Tobii Pro and PsychoPy.

The ETracker class is a simplified Python interface designed to streamline the process of running infant eye-tracking experiments. It acts as a bridge between the Tobii Pro SDK (version 3.0 or later) and the popular experiment-building framework, PsychoPy.

This class is the central hub for your eye-tracking experiment. Instead of managing low-level SDK functions, ETracker provides a clean, unified workflow for key experimental tasks. It abstracts away complex boilerplate code so you can focus on your research.

Key features include:

  • Experiment Control: Start, stop, and manage eye-tracking recordings with simple method calls.
  • Data Management: Automatically save recorded gaze data to a specified file format.
  • Calibration: Easily run a calibration procedure or load an existing calibration file to prepare the eye tracker.
  • Seamless Integration: Built specifically to integrate with PsychoPy’s experimental loop, making it a natural fit for your existing research designs.

This class is intended to be the first object you instantiate in your experiment script. It provides a minimal yet powerful set of methods that are essential for conducting a reliable and reproducible eye-tracking study.

Methods

Name Description
calibrate Run infant-friendly calibration procedure.
gaze_contingent Initialize real-time gaze buffer for contingent applications.
get_gaze_position Get current gaze position from rolling buffer.
load_calibration Load calibration data from a file and apply it to the eye tracker.
record_event Record timestamped experimental event during data collection.
save_calibration Save the current calibration data to a file.
save_data Save buffered gaze and event data to file with optimized processing.
set_eyetracking_settings Configure and apply Tobii eye tracker settings.
show_status Real-time visualization of participant’s eye position in track box.
start_recording Begin gaze data recording session.
stop_recording Stop gaze data recording and finalize session.

calibrate

ETracker.calibrate(
    calibration_points,
    infant_stims,
    shuffle=True,
    audio=None,
    anim_type='zoom',
    visualization_style='circles',
    save_calib=False,
    num_samples=5,
)

Run infant-friendly calibration procedure.

Performs eye tracker calibration using animated stimuli to engage infant participants. The calibration establishes the mapping between eye position and screen coordinates, which is essential for accurate gaze data collection. Automatically selects the appropriate calibration method based on operating mode (real eye tracker vs. mouse simulation).

Parameters

Name Type Description Default
calibration_points list[tuple[float, float]] Target locations in PsychoPy coordinates (e.g., height units). Typically 5–9 points distributed across the screen. required
infant_stims list[str] Paths to engaging image files for calibration targets (e.g., animated characters, colorful objects). required
shuffle bool Whether to randomize stimulus presentation order. Default True. True
audio psychopy.sound.Sound | None Attention-getting sound to play during calibration. Default None. None
anim_type (zoom, trill) Animation style for the stimuli. Default ‘zoom’. 'zoom'
save_calib bool | str Controls saving of calibration after a successful run: - False: do not save (default) - True: save using default naming (timestamped) - str: save to this filename; if it has no extension, ‘.dat’ is added. False
num_samples int Samples per point in simulation mode. Default 5. 5

Returns

Name Type Description
bool True if calibration completed successfully, False otherwise.

Notes

  • Real mode uses Tobii’s calibration with result visualization.
  • Simulation mode uses mouse position to approximate the process.
  • If in simulation mode, any save request is safely skipped with a warning.
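
The expected shape of calibration_points can be illustrated with a small helper. This is a hypothetical sketch, not part of the ETracker API; the 0.4 offset is an arbitrary value in PsychoPy height units:

```python
# Hypothetical helper: a standard five-point grid (centre plus four
# corners) in PsychoPy 'height' units. The 0.4 offset is arbitrary;
# it simply keeps targets well inside the screen.
def five_point_grid(offset=0.4):
    return [
        (0.0, 0.0),
        (-offset, offset), (offset, offset),
        (-offset, -offset), (offset, -offset),
    ]

points = five_point_grid()  # pass as calibration_points
```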

Examples

# Define calibration points (height units) and engaging stimuli
calibration_points = [
    (0.0, 0.0),
    (-0.4, 0.4), (0.4, 0.4),
    (-0.4, -0.4), (0.4, -0.4),
]
infant_stims = ['stim1.png', 'stim2.png', 'stim3.png',
                'stim4.png', 'stim5.png']

# Run a basic calibration
success = ET_controller.calibrate(calibration_points, infant_stims)

# Calibrate without shuffling and with the 'trill' animation
success = ET_controller.calibrate(
    calibration_points,
    infant_stims,
    shuffle=False,
    anim_type='trill',
)

# Calibrate and save the result for later reuse
success = ET_controller.calibrate(
    calibration_points,
    infant_stims,
    save_calib='subject_01_calib.dat',
)

if success:
    ET_controller.start_recording('subject_01_data.h5')

gaze_contingent

ETracker.gaze_contingent(N=5)

Initialize real-time gaze buffer for contingent applications. Allocates a rolling buffer that keeps the N most recent gaze samples (default 5), which get_gaze_position() aggregates into a stable position estimate.
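
The buffer itself is internal, but its behaviour can be sketched with a fixed-length deque, assuming it simply retains the N most recent samples:

```python
from collections import deque

# Sketch of the rolling-buffer idea behind gaze_contingent(N):
# a deque with maxlen=N keeps only the N most recent gaze samples,
# so old samples fall off automatically as new ones arrive.
class GazeBuffer:
    def __init__(self, N=5):
        self.samples = deque(maxlen=N)

    def push(self, sample):
        self.samples.append(sample)

buf = GazeBuffer(N=3)
for s in [(0.0, 0.0), (0.1, 0.0), (0.1, 0.1), (0.2, 0.1)]:
    buf.push(s)
# buf.samples now holds only the 3 most recent samples
```

A smaller N lowers latency at the cost of more jitter; a larger N smooths the estimate but lags behind fast eye movements.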

get_gaze_position

ETracker.get_gaze_position(fallback_offscreen=True, method='median')

Get current gaze position from rolling buffer.

Aggregates recent gaze samples from both eyes to provide a stable, real-time gaze estimate. Handles missing or invalid data gracefully.
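
A rough sketch of the aggregation logic, assuming buffered samples are plain (x, y) tuples with invalid samples stored as NaN (the real buffer layout and offscreen fallback value are internal details):

```python
import math
from statistics import median

OFFSCREEN = (3.0, 3.0)  # stand-in for the offscreen fallback position

def aggregate(samples, method="median", fallback_offscreen=True):
    # Drop invalid (NaN) samples first
    valid = [(x, y) for x, y in samples
             if not (math.isnan(x) or math.isnan(y))]
    if not valid:
        return OFFSCREEN if fallback_offscreen else None
    if method == "last":          # lowest latency
        return valid[-1]
    xs, ys = zip(*valid)
    if method == "mean":          # smooth but outlier-sensitive
        return (sum(xs) / len(xs), sum(ys) / len(ys))
    return (median(xs), median(ys))  # default: robust to outliers
```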

Parameters

Name Type Description Default
fallback_offscreen bool If True (default), returns an offscreen position (3x screen dimensions) when no valid gaze data is available. If False, returns None. True
method str Aggregation method for combining samples and eyes. - “median” (default): Robust to outliers, good for noisy data - “mean”: Smoother but sensitive to outliers - “last”: Lowest latency, uses only most recent sample 'median'

Returns

Name Type Description
tuple or None Gaze position (x, y) in PsychoPy coordinates (current window units), or None if no valid data and fallback_offscreen=False.

Raises

Name Type Description
RuntimeError If gaze_contingent() was not called to initialize the buffer.

Examples

>>> # Basic usage (median aggregation)
>>> pos = tracker.get_gaze_position()
>>> if pos is not None:
...     circle.pos = pos
>>> # Use mean for smoother tracking
>>> pos = tracker.get_gaze_position(method="mean")
>>> # Lowest latency (last sample only)
>>> pos = tracker.get_gaze_position(method="last")
>>> # Return None instead of offscreen position
>>> pos = tracker.get_gaze_position(fallback_offscreen=False)
>>> if pos is None:
...     print("No valid gaze data")

load_calibration

ETracker.load_calibration(filename=None, use_gui=False)

Load calibration data from a file and apply it to the eye tracker.

This method allows reusing a previously saved calibration, which can save significant time for participants, especially in multi-session studies. The calibration data must be a binary file generated by a Tobii eye tracker, typically via the save_calibration() method. This operation is only available when connected to a physical eye tracker.

Parameters

Name Type Description Default
filename str The path to the calibration data file (e.g., “subject_01_calib.dat”). If use_gui is True, this path is used as the default suggestion in the file dialog. If use_gui is False, this parameter is required. None
use_gui bool If True, a graphical file-open dialog is displayed for the user to select the calibration file. Defaults to False. False

Returns

Name Type Description
bool Returns True if the calibration was successfully loaded and applied, and False otherwise (e.g., user cancelled the dialog, file not found, or data was invalid).

Raises

Name Type Description
RuntimeError If the method is called while the ETracker is in simulation mode.
ValueError If use_gui is False and filename is not provided.

Examples

# Load calibration from specific file
success = ET_controller.load_calibration('subject_01_calib.dat')
if success:
    ET_controller.start_recording('subject_01_data.h5')

# Use GUI to select file
success = ET_controller.load_calibration(use_gui=True)

# Multi-session workflow
# Session 1: Calibrate and save
ET_controller.calibrate(calibration_points, infant_stims)
ET_controller.save_calibration('participant_123.dat')
ET_controller.start_recording('session_1.h5')
# ... run experiment ...
ET_controller.stop_recording()

# Session 2: Load previous calibration
ET_controller.load_calibration('participant_123.dat')
ET_controller.start_recording('session_2.h5')
# ... run experiment ...
ET_controller.stop_recording()

record_event

ETracker.record_event(label)

Record timestamped experimental event during data collection.

Events are merged with gaze data based on timestamp proximity during save operations. Uses appropriate timing source for simulation vs. real eye tracker modes.
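
The merge step can be sketched as a nearest-timestamp match. This is an illustrative sketch under that assumption, not the class’s actual implementation:

```python
# Attach each event to the gaze sample whose timestamp is closest.
# gaze_timestamps: sorted timestamps; events: (timestamp, label) pairs.
def merge_events(gaze_timestamps, events):
    merged = {}
    for t_event, label in events:
        nearest = min(range(len(gaze_timestamps)),
                      key=lambda i: abs(gaze_timestamps[i] - t_event))
        merged[nearest] = label
    return merged  # maps gaze-sample index -> event label

merge_events([0.0, 0.1, 0.2, 0.3], [(0.09, 'stimulus_onset')])
# -> {1: 'stimulus_onset'}
```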

Parameters

Name Type Description Default
label str Descriptive label for the event (e.g., ‘trial_start’, ‘stimulus_onset’). required

Raises

Name Type Description
RuntimeWarning If called when recording is not active.

Examples

tracker.record_event('trial_1_start')
# ... present stimulus ...
tracker.record_event('stimulus_offset')

save_calibration

ETracker.save_calibration(filename=None, use_gui=False)

Save the current calibration data to a file.

Retrieves the active calibration data from the connected Tobii eye tracker and saves it as a binary file. This can be reloaded later with load_calibration() to avoid re-calibrating the same participant.

Parameters

Name Type Description Default
filename str | None Desired output path. If None and use_gui is False, a timestamped default name is used (e.g., ‘YYYY-mm-dd_HH-MM-SS_calibration.dat’). If provided without an extension, ‘.dat’ is appended. If an extension is already present, it is left unchanged. None
use_gui bool If True, opens a PsychoPy file-save dialog where the user chooses the path. The suggested name respects the logic above. Default False. False

Returns

Name Type Description
bool True if saved successfully; False if cancelled, no data available, in simulation mode, or on error.

Notes

  • In simulation mode, saving is skipped and a warning is issued.
  • If use_gui is True and the dialog is cancelled, returns False.

Examples

# Save with default timestamped name
ET_controller.save_calibration()

# Save with specified filename
ET_controller.save_calibration('subject_01_calib.dat')

save_data

ETracker.save_data()

Save buffered gaze and event data to file with optimized processing.

Uses thread-safe buffer swapping to minimize lock time, then processes and saves data in CSV or HDF5 format. Events are merged with gaze data based on timestamp proximity.

This method is typically called automatically by stop_recording(), but can be called manually during recording to periodically save data and clear buffers. This is useful for long experiments to avoid memory buildup and ensure data is saved even if the program crashes.
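
The buffer-swap pattern can be sketched as follows (a simplified sketch; the real class manages several buffers plus file writing):

```python
import threading

# Swap-under-lock: hold the lock only long enough to exchange the
# filled buffer for an empty one, then process the old buffer
# without blocking the callback that appends new samples.
class SwappableBuffer:
    def __init__(self):
        self._lock = threading.Lock()
        self._buffer = []

    def append(self, sample):      # called from the gaze-data callback
        with self._lock:
            self._buffer.append(sample)

    def swap(self):                # called from save_data()
        with self._lock:
            old, self._buffer = self._buffer, []
        return old                 # process/save outside the lock

buf = SwappableBuffer()
buf.append((0.0, 0.5, 0.5))
snapshot = buf.swap()              # snapshot == [(0.0, 0.5, 0.5)]
```

Keeping the critical section to a single list swap is what keeps lock time minimal even when the saved chunk is large.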

Notes

  • Automatically called by stop_recording()
  • Safe to call during active recording
  • Clears buffers after saving
  • Events are matched to nearest gaze sample by timestamp

Examples

# Automatic saving (most common)
ET_controller.start_recording('data.h5')
# ... run experiment ...
ET_controller.stop_recording()  # Automatically calls save_data()

# Manual periodic saves for long experiments
ET_controller.start_recording('long_experiment.h5')

for trial in range(100):
    ET_controller.record_event(f'trial_{trial}_start')
    # ... present stimuli ...
    ET_controller.record_event(f'trial_{trial}_end')
    
    # Save data every 10 trials to prevent memory buildup
    if (trial + 1) % 10 == 0:
        ET_controller.save_data()  # Saves and clears buffers

ET_controller.stop_recording()

# Save data at natural break points
ET_controller.start_recording('session.h5')

# Block 1
for trial in range(20):
    # ... run trial ...
    pass
ET_controller.save_data()  # Save after block 1

# Short break
core.wait(30)

# Block 2
for trial in range(20):
    # ... run trial ...
    pass
ET_controller.save_data()  # Save after block 2

ET_controller.stop_recording()

set_eyetracking_settings

ETracker.set_eyetracking_settings(
    desired_fps=None,
    desired_illumination_mode=None,
    use_gui=False,
)

Configure and apply Tobii eye tracker settings.

This method updates the eye tracker’s sampling frequency (FPS) and illumination mode, either programmatically or via a graphical interface. It ensures that configuration changes are only made when the device is idle and connected.

Parameters

Name Type Description Default
desired_fps int Desired sampling frequency in Hz (e.g., 60, 120, 300). If None, the current frequency is retained. None
desired_illumination_mode str Desired illumination mode (e.g., ‘Auto’, ‘Bright’, ‘Dark’). If None, the current illumination mode is retained. None
use_gui bool If True, opens a PsychoPy GUI dialog that allows users to select settings interactively. Defaults to False. False

Raises

Name Type Description
RuntimeError If no physical eye tracker is connected or if the function is called in simulation mode.
ValueError If the specified FPS or illumination mode is not supported by the connected device.

Notes

  • Settings cannot be changed during active recording. If an ongoing recording is detected, a non-blocking warning is issued and the function exits safely.
  • When use_gui=True, a PsychoPy dialog window appears. It must be closed manually before the program continues.
  • After successfully applying new settings, the internal attributes self.fps and self.illum_mode are updated to reflect the current device configuration.

Examples

# Set frequency to 120 Hz programmatically
ET_controller.set_eyetracking_settings(desired_fps=120)

# Set illumination mode to 'Bright'
ET_controller.set_eyetracking_settings(desired_illumination_mode='Bright')

# Set both frequency and illumination mode
ET_controller.set_eyetracking_settings(desired_fps=120, desired_illumination_mode='Bright')

# Use GUI to select settings interactively
ET_controller.set_eyetracking_settings(use_gui=True)

show_status

ETracker.show_status(decision_key='space')

Real-time visualization of participant’s eye position in track box.

Creates interactive display showing left/right eye positions and distance from screen. Useful for positioning participants before data collection. Updates continuously until exit key is pressed.

Parameters

Name Type Description Default
decision_key str Key to press to exit visualization. Default ‘space’. 'space'

Notes

In simulation mode, use scroll wheel to adjust simulated distance. Eye positions shown as green (left) and red (right) circles.

start_recording

ETracker.start_recording(filename=None, raw_format=False)

Begin gaze data recording session.

Initializes file structure, clears any existing buffers, and starts data collection from either the eye tracker or simulation mode. Creates HDF5 or CSV files based on filename extension.
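
The format dispatch can be sketched from the documented extension rules; how unrecognized extensions are treated here is an assumption:

```python
from pathlib import Path

# Pick the output format from the filename extension, defaulting to HDF5.
def output_format(filename):
    suffix = Path(filename).suffix.lower()
    if suffix == '.csv':
        return 'csv'
    if suffix in ('.h5', '.hdf5'):
        return 'hdf5'
    return 'hdf5'  # assumption: no/unknown extension falls back to HDF5

output_format('data.csv')   # 'csv'
output_format('data')       # 'hdf5'
```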

Parameters

Name Type Description Default
filename str Output filename for gaze data. If None, generates timestamp-based name. File extension determines format (.h5/.hdf5 for HDF5, .csv for CSV, defaults to .h5). None
raw_format bool If True, preserves all original Tobii SDK column names and data. If False (default), uses simplified column names and subset of columns. Raw format is useful for advanced analysis requiring full metadata. False

Examples

# Record to HDF5 (default format)
ET_controller.start_recording('data.h5')

# Record to CSV instead
ET_controller.start_recording('data.csv')

# Keep the full set of Tobii SDK columns for advanced analysis
ET_controller.start_recording('data.h5', raw_format=True)

stop_recording

ETracker.stop_recording()

Stop gaze data recording and finalize session.

Performs complete shutdown: stops data collection, cleans up resources, saves all buffered data, and reports session summary. Handles both simulation and real eye tracker modes appropriately.

Raises

Name Type Description
UserWarning If recording is not currently active.

Notes

All pending data in buffers is automatically saved before completion. Recording duration is measured from start_recording() call.
