ETracker
```python
ETracker(win, etracker_id=0, simulate=False, verbose=True)
```

A high-level controller for running eye-tracking experiments with Tobii Pro and PsychoPy.
The ETracker class is a simplified Python interface designed to streamline the process of running infant eye-tracking experiments. It acts as a bridge between the Tobii Pro SDK (version 3.0 or later) and the popular experiment-building framework, PsychoPy.
This class is the central hub of your eye-tracking experiment. Instead of requiring you to manage low-level SDK functions, ETracker provides a clean, unified workflow for key experimental tasks, abstracting away complex boilerplate code so you can focus on your research.
Key features include:

- Experiment Control: Start, stop, and manage eye-tracking recordings with simple method calls.
- Data Management: Automatically save recorded gaze data to a specified file format.
- Calibration: Easily run a calibration procedure or load an existing calibration file to prepare the eye tracker.
- Seamless Integration: Built specifically to integrate with PsychoPy's experimental loop, making it a natural fit for your existing research designs.
This class is intended to be the first object you instantiate in your experiment script. It provides a minimal yet powerful set of methods that are essential for conducting a reliable and reproducible eye-tracking study.
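A minimal end-to-end script might look like this (a sketch using only the methods documented below; the import path and filenames are placeholders):

```python
from psychopy import visual
# from your_package import ETracker  # import path depends on your installation

# Open a PsychoPy window and attach the controller to it
win = visual.Window(fullscr=True, units='height')
controller = ETracker(win, etracker_id=0, simulate=False, verbose=True)

controller.show_status()                    # position the participant
if controller.calibrate(5):                 # 5-point calibration
    controller.start_recording('infant_01.h5')
    controller.record_event('trial_1_start')
    # ... present stimuli ...
    controller.record_event('trial_1_end')
    controller.stop_recording()

win.close()
```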
Methods
| Name | Description |
|---|---|
| calibrate | Run infant-friendly calibration procedure. |
| gaze_contingent | Initialize real-time gaze buffer for contingent applications. |
| get_gaze_position | Get current gaze position from rolling buffer. |
| load_calibration | Load calibration data from a file and apply it to the eye tracker. |
| record_event | Record timestamped experimental event during data collection. |
| save_calibration | Save the current calibration data to a file. |
| save_data | Save buffered gaze and event data to file with optimized processing. |
| set_eyetracking_settings | Configure and apply Tobii eye tracker settings. |
| show_status | Real-time visualization of participant’s eye position in track box. |
| start_recording | Begin gaze data recording session. |
| stop_recording | Stop gaze data recording and finalize session. |
calibrate
```python
ETracker.calibrate(
    calibration_points=5,
    infant_stims=True,
    shuffle=True,
    audio=True,
    anim_type='zoom',
    stim_size='big',
    visualization_style='circles',
)
```

Run infant-friendly calibration procedure.
Performs eye tracker calibration using animated stimuli to engage infant participants. The calibration establishes the mapping between eye position and screen coordinates, which is essential for accurate gaze data collection. Automatically selects the appropriate calibration method based on operating mode (real eye tracker vs. mouse simulation).
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| calibration_points | int or list of tuple | Calibration pattern specification: 5 or 9 for a standard pattern, or a list of (x, y) coordinate tuples for custom point positions. | 5 |
| infant_stims | True or str or list or visual stimulus | Calibration stimulus specification. Accepts multiple formats: True for the built-in stimuli, a path to an image file, a list of image paths, or one or more pre-loaded PsychoPy visual stimuli. | True |
| shuffle | bool | Whether to randomize stimulus presentation order after any necessary repetition. Helps prevent habituation to the stimulus sequence. Default is True. | True |
| audio | True or False or None or psychopy.sound.Sound | Controls attention-getting audio during calibration: True uses the built-in looping calibration sound (default); False or None disables audio. A psychopy.sound.Sound object may be provided for custom audio (ensure it is configured appropriately, e.g., loops=-1 for continuous playback). | True |
| anim_type | str, 'zoom' or 'trill' | Animation style for calibration stimuli. | 'zoom' |
| stim_size | str, 'big' or 'small' | Size preset for calibration stimuli. | 'big' |
| visualization_style | str, 'circles' or 'lines' | How to display calibration results. | 'circles' |
Returns
| Name | Type | Description |
|---|---|---|
|  | bool | True if calibration completed successfully and was accepted by the user. False if calibration was aborted (e.g., via ESC key) or failed. |
Raises
| Type | Description |
|---|---|
| ValueError | If calibration_points is not 5, 9, or a valid list of coordinate tuples; if visualization_style is not ‘circles’ or ‘lines’; or if infant_stims format is unrecognized. |
| TypeError | If pre-loaded stimuli include unsupported types (e.g., TextStim, MovieStim). |
Examples

Calibration Point Patterns

Standard 5-point calibration

```python
controller.calibrate(5)
```

Comprehensive 9-point calibration

```python
controller.calibrate(9)
```

Custom calibration points

```python
custom_points = [
    (0.0, 0.0),    # Center
    (-0.5, 0.5),   # Top-left
    (0.5, 0.5),    # Top-right
    (-0.5, -0.5),  # Bottom-left
    (0.5, -0.5)    # Bottom-right
]
controller.calibrate(custom_points)
```

Stimulus Options

Single image file

```python
controller.calibrate(5, infant_stims='my_stimulus.png')
```

Multiple image files

```python
controller.calibrate(5, infant_stims=['stim1.png', 'stim2.png', 'stim3.png'])
```

Single shape stimulus

```python
red_square = visual.Rect(win, size=0.08, fillColor='red', units='height')
controller.calibrate(5, infant_stims=red_square)
```

Multiple shape stimuli

```python
shapes = [
    visual.Circle(win, radius=0.04, fillColor='red', units='height'),
    visual.Rect(win, size=0.08, fillColor='blue', units='height'),
    visual.Polygon(win, edges=6, radius=0.04, fillColor='green', units='height')
]
controller.calibrate(5, infant_stims=shapes, shuffle=True)
```

Animation and Audio

Custom audio

```python
from psychopy import sound
my_sound = sound.Sound('custom_beep.wav', loops=-1)
controller.calibrate(5, audio=my_sound)
```

Trill animation without audio

```python
controller.calibrate(5, audio=False, anim_type='trill')
```

Visualization Styles

Lines visualization

```python
controller.calibrate(5, visualization_style='lines')
```

Circles visualization with small stimuli

```python
controller.calibrate(5, stim_size='small', visualization_style='circles')
```

Complete Workflows

Full calibration workflow

```python
# Position participant
controller.show_status()

# Run calibration
success = controller.calibrate(
    calibration_points=9,
    infant_stims=['stim1.png', 'stim2.png'],
    shuffle=True,
    audio=True,
    anim_type='zoom',
    visualization_style='circles'
)

if success:
    controller.start_recording('data.h5')
    # ... run experiment ...
    controller.stop_recording()
```

gaze_contingent
```python
ETracker.gaze_contingent(N=0.5, units='seconds')
```

Initialize real-time gaze buffer for contingent applications.
Creates a rolling buffer that stores recent gaze samples over a specified time window or number of samples, enabling real-time gaze-contingent paradigms. Must be called before using get_gaze_position() for real-time gaze tracking.
The buffer automatically maintains the most recent samples, discarding older data. This provides a stable estimate of current gaze position by aggregating across multiple samples.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| N | float or int | Buffer size specification. Interpretation depends on units: a duration in seconds when units='seconds', or an exact sample count when units='samples'. | 0.5 |
| units | str | Unit for the buffer size specification: 'seconds' or 'samples'. | 'seconds' |
Raises
| Type | Description |
|---|---|
| TypeError | If N is not numeric or units is not a string. |
| ValueError | If units is not ‘seconds’ or ‘samples’, or if calculated buffer size < 1. |
Details
- Call this method ONCE before your experimental loop
- Buffer size trades off stability vs. latency (see the sketch after this list):
- Shorter duration/fewer samples: Lower latency, more noise
- Longer duration/more samples: Smoother tracking, higher latency
- For 120 Hz tracker with N=0.5 seconds: 60 samples, ~500ms latency
- For 60 Hz tracker with N=0.5 seconds: 30 samples, ~500ms latency
- For any tracker with N=5 samples: 5 samples, variable latency by fps
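To make the mapping concrete, here is a minimal sketch (a hypothetical helper, not part of the ETracker API) of how a time-based buffer size translates into sample counts:

```python
def buffer_samples(N, units, fps):
    """Approximate rolling-buffer length for a given tracker frequency."""
    if units == 'seconds':
        return max(1, round(N * fps))  # e.g., 0.5 s at 120 Hz -> 60 samples
    if units == 'samples':
        return max(1, int(N))          # exact count; latency varies with fps
    raise ValueError("units must be 'seconds' or 'samples'")

print(buffer_samples(0.5, 'seconds', 120))  # 60
print(buffer_samples(0.5, 'seconds', 60))   # 30
print(buffer_samples(5, 'samples', 120))    # 5
```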
Examples
Basic Usage

Default time-based buffer (0.5 seconds)

```python
# Initialize with a 500ms window (adapts to tracker frequency)
ET_controller.gaze_contingent()  # Uses default N=0.5, units='seconds'
ET_controller.start_recording('data.h5')

# Create gaze-contingent stimulus
circle = visual.Circle(win, radius=0.05, fillColor='red')
for frame in range(600):  # 10 seconds at 60 fps
    gaze_pos = ET_controller.get_gaze_position()
    circle.pos = gaze_pos
    circle.draw()
    win.flip()

ET_controller.stop_recording()
```

Custom time window

```python
# 250ms window for lower latency
ET_controller.gaze_contingent(N=0.25, units='seconds')

# 1 second window for very smooth tracking
ET_controller.gaze_contingent(N=1.0, units='seconds')
```

Sample-Based Configuration

Explicit sample count

```python
# Exactly the 5 most recent samples
ET_controller.gaze_contingent(N=5, units='samples')

# Exactly 10 samples for smoother tracking
ET_controller.gaze_contingent(N=10, units='samples')
```

Complete Applications

Gaze-contingent window paradigm

```python
# 300ms time window (auto-adjusts to tracker frequency)
ET_controller.gaze_contingent(N=0.3, units='seconds')
ET_controller.start_recording('gaze_window.h5')

stimulus = visual.ImageStim(win, 'image.png')
window = visual.Circle(win, radius=0.1, fillColor=None, lineColor='white')

for trial in range(20):
    for frame in range(120):  # 2 seconds
        stimulus.draw()  # redraw every frame; flip clears the buffer
        gaze_pos = ET_controller.get_gaze_position()
        window.pos = gaze_pos
        window.draw()
        win.flip()
    ET_controller.record_event(f'trial_{trial}_end')

ET_controller.stop_recording()
```

get_gaze_position
```python
ETracker.get_gaze_position(
    fallback_offscreen=True,
    method='median',
    coordinate_units='default',
)
```

Get current gaze position from rolling buffer.
Aggregates recent gaze samples from both eyes to provide a stable, real-time gaze estimate. Handles missing or invalid data gracefully.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| fallback_offscreen | bool | If True (default), returns an offscreen position (3x screen dimensions) when no valid gaze data is available. If False, returns None. | True |
| method | str | Aggregation method for combining samples and eyes: 'median' (default), 'mean', or 'last'. | 'median' |
| coordinate_units | str | Target coordinate system for the returned gaze position: 'default' (the window's current units), 'tobii', 'pix', 'height', or 'norm'. | 'default' |
Returns
| Name | Type | Description |
|---|---|---|
|  | tuple or None | Gaze position (x, y) in the specified coordinates, or None if no valid data is available and fallback_offscreen=False. |
Raises
| Type | Description |
|---|---|
| RuntimeError | If gaze_contingent() was not called to initialize the buffer. |
Details
Coordinate System Conversion
The coordinate_units parameter determines how gaze positions are returned from the real-time buffer. By default, positions are returned in your window’s current coordinate system, making it seamless to assign gaze positions directly to stimulus objects. For example, if your window uses height units, the returned gaze position will automatically be in height units, ready to use with stimulus.pos = gaze_pos. You can override this behavior to request gaze positions in any coordinate system regardless of your window settings. Setting coordinate_units='tobii' returns the raw normalized coordinates from the eye tracker, where values range from 0 to 1 with the origin at the top-left corner.
Examples
Basic Usage
Default behavior (window’s coordinate system)
```python
pos = ET_controller.get_gaze_position()
if pos is not None:
    circle.pos = pos
```

Coordinate Systems

Get position in Tobii coordinates (0-1 range)

```python
pos = ET_controller.get_gaze_position(coordinate_units='tobii')
# Returns: (0.5, 0.3) for gaze horizontally centered, above the middle
```

Get position in pixels (center origin)

```python
pos = ET_controller.get_gaze_position(coordinate_units='pix')
# Returns: (120, -50) for gaze slightly right of and below center
```

Get position in height units

```python
pos = ET_controller.get_gaze_position(coordinate_units='height')
# Returns: (0.15, -0.08) in height units
```

Aggregation Methods

Mean for smoother tracking

```python
pos = ET_controller.get_gaze_position(method="mean")
```

Last sample for lowest latency

```python
pos = ET_controller.get_gaze_position(method="last")
```

Handling Missing Data

Return None instead of offscreen position

```python
pos = ET_controller.get_gaze_position(fallback_offscreen=False)
if pos is None:
    print("No valid gaze data")
```

Check for offscreen gaze

```python
pos = ET_controller.get_gaze_position(fallback_offscreen=True)
# Offscreen positions will be far outside window bounds
if abs(pos[0]) > 2.0 or abs(pos[1]) > 2.0:
    print("Participant looking away")
```

Complete Application

Gaze-contingent stimulus in normalized coordinates

```python
ET_controller.gaze_contingent(N=0.3, units='seconds')
ET_controller.start_recording('data.h5')

circle = visual.Circle(win, radius=0.05, fillColor='red', units='norm')
for frame in range(600):
    # Get gaze in normalized coordinates to match the circle's units
    gaze_pos = ET_controller.get_gaze_position(coordinate_units='norm')
    circle.pos = gaze_pos
    circle.draw()
    win.flip()

ET_controller.stop_recording()
```

load_calibration
```python
ETracker.load_calibration(
    filename=None,
    use_gui=False,
    screen=-1,
    alwaysOnTop=True,
)
```

Load calibration data from a file and apply it to the eye tracker.
This method allows reusing a previously saved calibration, which can save significant time for participants, especially in multi-session studies. The calibration data must be a binary file generated by a Tobii eye tracker, typically via the save_calibration() method. This operation is only available when connected to a physical eye tracker.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| filename | str | The path to the calibration data file (e.g., 'subject_01_calib.dat'). If use_gui is True, this path is used as the default suggestion in the file dialog. If use_gui is False, this parameter is required. | None |
| use_gui | bool | If True, a graphical file-open dialog is displayed for the user to select the calibration file. Defaults to False. | False |
| screen | int | Screen number where the GUI dialog is displayed. Only used when use_gui=True. If -1 (default), the dialog appears on the primary screen. Use 0, 1, 2, etc. to specify other monitors. Ignored when use_gui=False. | -1 |
| alwaysOnTop | bool | Whether the GUI dialog stays on top of other windows. Only used when use_gui=True. Default is True to prevent the dialog from being hidden behind experiment windows. Ignored when use_gui=False. | True |
Returns
| Name | Type | Description |
|---|---|---|
|  | bool | Returns True if the calibration was successfully loaded and applied, and False otherwise (e.g., user cancelled the dialog, file not found, or data was invalid). |
Raises
| Type | Description |
|---|---|
| RuntimeError | If the method is called while the ETracker is in simulation mode. |
| ValueError | If use_gui is False and filename is not provided. |
Examples
Load calibration from specific file
```python
success = ET_controller.load_calibration('subject_01_calib.dat')
if success:
    ET_controller.start_recording('subject_01_data.h5')
```

Use GUI to select file

```python
success = ET_controller.load_calibration(use_gui=True)
```

GUI on secondary monitor

```python
success = ET_controller.load_calibration(
    use_gui=True,
    screen=1,
    alwaysOnTop=False
)
```

Multi-session workflow

```python
# Session 1: Calibrate and save
ET_controller.calibrate(5)
ET_controller.save_calibration('participant_123.dat')
ET_controller.start_recording('session_1.h5')
# ... run experiment ...
ET_controller.stop_recording()

# Session 2: Load previous calibration
ET_controller.load_calibration('participant_123.dat')
ET_controller.start_recording('session_2.h5')
# ... run experiment ...
ET_controller.stop_recording()
```

record_event
```python
ETracker.record_event(label)
```

Record timestamped experimental event during data collection.
Events are merged with gaze data based on timestamp proximity during save operations. Uses appropriate timing source for simulation vs. real eye tracker modes.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| label | str | Descriptive label for the event (e.g., ‘trial_start’, ‘stimulus_onset’). | required |
Raises
| Type | Description |
|---|---|
| RuntimeWarning | If called when recording is not active. |
Details
Event-Gaze Synchronization
Events are stored separately and merged with gaze data when save_data() is called. Each event is aligned to the next closest gaze sample (at or after the event timestamp) using binary search. When multiple events occur within the same sampling interval, they are concatenated with semicolon delimiters in the Events column (e.g., ‘fixation_offset; stimulus_onset’). This ensures no event data is lost even when events occur in rapid succession.
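Conceptually, the alignment works like the following sketch (hypothetical code using Python's bisect; the real implementation is internal to ETracker):

```python
from bisect import bisect_left

def merge_events(gaze_timestamps, events):
    """Align each (timestamp, label) event to the first gaze sample at or
    after it; labels landing on the same sample are concatenated."""
    merged = {}  # gaze sample index -> 'label1; label2'
    for t_event, label in events:
        i = bisect_left(gaze_timestamps, t_event)  # binary search
        i = min(i, len(gaze_timestamps) - 1)       # clamp trailing events
        merged[i] = f'{merged[i]}; {label}' if i in merged else label
    return merged

# Two events inside one 120 Hz sampling interval (~8333 microseconds)
gaze_ts = [0, 8333, 16666, 24999]
events = [(9000, 'fixation_offset'), (12000, 'stimulus_onset')]
print(merge_events(gaze_ts, events))  # {2: 'fixation_offset; stimulus_onset'}
```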
Examples
Basic Usage
Recording single events
```python
ET_controller.record_event('trial_1_start')
# ... present stimulus ...
ET_controller.record_event('stimulus_offset')
```

Common Patterns

Complete trial structure

```python
ET_controller.record_event('trial_1_start')
ET_controller.record_event('fixation_onset')
core.wait(1.0)
ET_controller.record_event('fixation_offset')
ET_controller.record_event('stimulus_onset')
# ... show stimulus ...
ET_controller.record_event('stimulus_offset')
ET_controller.record_event('response_prompt')
# ... wait for response ...
ET_controller.record_event('response_recorded')
ET_controller.record_event('trial_1_end')
```

Multi-trial experiment

```python
ET_controller.start_recording('experiment.h5')
for trial_num in range(10):
    ET_controller.record_event(f'trial_{trial_num}_start')
    # ... run trial ...
    ET_controller.record_event(f'trial_{trial_num}_end')
ET_controller.stop_recording()
```

Rapid successive events

```python
# Events occurring within the same sampling interval
ET_controller.record_event('fixation_offset')
ET_controller.record_event('stimulus_onset')
# In saved data, may appear as: "fixation_offset; stimulus_onset"
```

save_calibration
```python
ETracker.save_calibration(
    filename=None,
    use_gui=False,
    screen=-1,
    alwaysOnTop=True,
)
```

Save the current calibration data to a file.
Retrieves the active calibration data from the connected Tobii eye tracker and saves it as a binary file. This can be reloaded later with load_calibration() to avoid re-calibrating the same participant.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| filename | str or None | Desired output path. If None and use_gui is False, a timestamped default name is used (e.g., 'YYYY-mm-dd_HH-MM-SS_calibration.dat'). If provided without an extension, '.dat' is appended. If an extension is already present, it is left unchanged. | None |
| use_gui | bool | If True, opens a PsychoPy file-save dialog where the user chooses the path. The suggested name respects the logic above. Default False. | False |
| screen | int | Screen number where the GUI dialog is displayed. Only used when use_gui=True. If -1 (default), the dialog appears on the primary screen. Use 0, 1, 2, etc. to specify other monitors. Ignored when use_gui=False. | -1 |
| alwaysOnTop | bool | Whether the GUI dialog stays on top of other windows. Only used when use_gui=True. Default is True to prevent the dialog from being hidden behind experiment windows. Ignored when use_gui=False. | True |
Returns
| Name | Type | Description |
|---|---|---|
|  | bool | True if saved successfully; False if cancelled, no data available, in simulation mode, or on error. |
Details
- In simulation mode, saving is skipped and a warning is issued.
- If use_gui is True and the dialog is cancelled, returns False.
Examples
Save with default timestamped name
```python
ET_controller.save_calibration()
```

Save with specified filename

```python
ET_controller.save_calibration('subject_01_calib.dat')
```

Use GUI to choose save location

```python
ET_controller.save_calibration(use_gui=True)
```

GUI on secondary monitor

```python
ET_controller.save_calibration(use_gui=True, screen=1, alwaysOnTop=False)
```

save_data
```python
ETracker.save_data()
```

Save buffered gaze and event data to file with optimized processing.
Uses thread-safe buffer swapping to minimize lock time, then processes and saves data in CSV or HDF5 format. Events are merged with gaze data based on timestamp proximity.
This method is typically called automatically by stop_recording(), but can be called manually during recording to periodically save data and clear buffers. This is useful for long experiments to avoid memory buildup and ensure data is saved even if the program crashes.
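The exact internals are not part of the public API, but the buffer-swap idea can be sketched as follows (hypothetical code, assuming gaze samples arrive on a callback thread):

```python
import threading

class SwapBuffer:
    """Minimal sketch of low-lock-time buffer swapping: the callback thread
    appends under a lock, and the saver swaps in a fresh list so slow file
    I/O happens outside the lock."""

    def __init__(self):
        self._lock = threading.Lock()
        self._samples = []

    def append(self, sample):  # called from the gaze callback thread
        with self._lock:
            self._samples.append(sample)

    def drain(self):  # called from save_data()
        with self._lock:
            out, self._samples = self._samples, []  # O(1) swap under lock
        return out  # process and write 'out' without holding the lock
```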
Details
- Automatically called by stop_recording()
- Safe to call during active recording
- Clears buffers after saving
- Events are matched to the nearest gaze sample by timestamp
- In HDF5 format, events are saved in two places:
  - Merged into the main gaze table's 'Events' column
  - As a separate 'events' table for independent event analysis
- In CSV format, events only appear in the 'Events' column
Examples
Automatic Usage
Default behavior (most common)
```python
ET_controller.start_recording('data.h5')
# ... run experiment ...
ET_controller.stop_recording()  # Automatically calls save_data()
```

Manual Periodic Saves

Save every N trials for long experiments

```python
ET_controller.start_recording('long_experiment.h5')
for trial in range(100):
    ET_controller.record_event(f'trial_{trial}_start')
    # ... present stimuli ...
    ET_controller.record_event(f'trial_{trial}_end')
    # Save data every 10 trials to prevent memory buildup
    if (trial + 1) % 10 == 0:
        ET_controller.save_data()  # Saves and clears buffers
ET_controller.stop_recording()
```

Strategic Save Points

Save at natural break points between blocks

```python
ET_controller.start_recording('session.h5')

# Block 1
for trial in range(20):
    # ... run trial ...
    pass
ET_controller.save_data()  # Save after block 1

# Short break
core.wait(30)

# Block 2
for trial in range(20):
    # ... run trial ...
    pass
ET_controller.save_data()  # Save after block 2

ET_controller.stop_recording()
```

set_eyetracking_settings
```python
ETracker.set_eyetracking_settings(
    desired_fps=None,
    desired_illumination_mode=None,
    use_gui=False,
    screen=-1,
    alwaysOnTop=True,
)
```

Configure and apply Tobii eye tracker settings.
This method updates the eye tracker’s sampling frequency (FPS) and illumination mode, either programmatically or via a graphical interface. It ensures that configuration changes are only made when the device is idle and connected.
After applying settings, a summary is displayed showing which settings changed and which remained the same.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| desired_fps | int | Desired sampling frequency in Hz (e.g., 60, 120, 300). If None, the current frequency is retained. When use_gui=True, this value pre-populates the dialog box dropdown. | None |
| desired_illumination_mode | str | Desired illumination mode (e.g., 'Auto', 'Bright', 'Dark'). If None, the current illumination mode is retained. When use_gui=True, this value pre-populates the dialog box dropdown. | None |
| use_gui | bool | If True, opens a PsychoPy GUI dialog with dropdown menus that allows users to select settings interactively. The dropdowns are pre-populated with values from desired_fps and desired_illumination_mode if provided, or current settings if None. Defaults to False. | False |
| screen | int | Screen number where the GUI dialog is displayed. Only used when use_gui=True. If -1 (default), the dialog appears on the primary screen. Use 0, 1, 2, etc. to specify other monitors. Ignored when use_gui=False. | -1 |
| alwaysOnTop | bool | Whether the GUI dialog stays on top of other windows. Only used when use_gui=True. Default is True to prevent the dialog from being hidden behind experiment windows. Ignored when use_gui=False. | True |
Raises
| Type | Description |
|---|---|
| RuntimeError | If no physical eye tracker is connected or if the function is called in simulation mode. |
| ValueError | If the specified FPS or illumination mode is not supported by the connected device. |
Details
- Settings cannot be changed during active recording. If an ongoing recording is detected, a non-blocking warning is issued and the function exits safely.
- When use_gui=True, a PsychoPy dialog window appears with dropdown menus. The screen and alwaysOnTop parameters control its display behavior.
- After successfully applying new settings, the internal attributes self.fps and self.illum_mode are updated to reflect the current device configuration.
- A summary of applied changes is displayed using NicePrint, showing which settings changed (with old -> new values) and which remained unchanged.
Examples
Programmatic Settings
Set frequency to 120 Hz

```python
ET_controller.set_eyetracking_settings(desired_fps=120)
```

Set illumination mode to 'Bright'

```python
ET_controller.set_eyetracking_settings(desired_illumination_mode='Bright')
```

Set both frequency and illumination mode

```python
ET_controller.set_eyetracking_settings(
    desired_fps=120,
    desired_illumination_mode='Bright'
)
```

GUI-Based Settings

Open GUI with default settings

```python
ET_controller.set_eyetracking_settings(use_gui=True)
```

GUI on secondary monitor without always-on-top

```python
ET_controller.set_eyetracking_settings(
    use_gui=True,
    screen=1,
    alwaysOnTop=False
)
```

GUI with pre-selected 120 Hz in dropdown

```python
ET_controller.set_eyetracking_settings(
    desired_fps=120,
    use_gui=True
)
```

show_status
```python
ETracker.show_status(decision_key='space', video_help=True)
```

Real-time visualization of the participant's eye position in the track box.

Creates an interactive display showing left and right eye positions and the distance from the screen. Useful for positioning participants before data collection. Updates continuously until the exit key is pressed.
Optionally displays an instructional video in the background to help guide participant positioning. You can use the built-in video, disable the video, or provide your own custom MovieStim object.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| decision_key | str | Key to press to exit visualization. Default ‘space’. | 'space' |
| video_help | bool or visual.MovieStim | Controls background video display: - True: Uses built-in instructional video (default) - False: No video displayed - visual.MovieStim: Uses your pre-loaded custom video. You are responsible for scaling (size) and positioning (pos) the MovieStim to fit your desired layout. Default True. | True |
Details
In simulation mode, use scroll wheel to adjust simulated distance. Eye positions shown as green (left) and red (right) circles.
The built-in video (when video_help=True) is sized at (1.06, 0.6) in height units and positioned at (0, -0.08) to avoid covering the track box.
Examples
Basic Usage
Default with built-in video

```python
ET_controller.show_status()
```

Without background video

```python
ET_controller.show_status(video_help=False)
```

Customization Options

Custom exit key

```python
ET_controller.show_status(decision_key='return')
```

Custom video with specific size and position

```python
from psychopy import visual
my_video = visual.MovieStim(
    win,
    'instructions.mp4',
    size=(0.8, 0.6),
    pos=(0, -0.1)
)
ET_controller.show_status(video_help=my_video)
```

Complete Workflows

Position participant before calibration

```python
# Position participant
ET_controller.show_status()

# Run calibration
success = ET_controller.calibrate(5)

# Start recording if calibration successful
if success:
    ET_controller.start_recording('data.h5')
```

start_recording
```python
ETracker.start_recording(
    filename=None,
    raw_format=False,
    coordinate_units='default',
    relative_timestamps='default',
)
```

Begin gaze data recording session.
Initializes file structure, clears any existing buffers, and starts data collection from either the eye tracker or simulation mode. Creates HDF5 or CSV files based on filename extension.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| filename | str | Output filename for gaze data. If None, generates a timestamp-based name. File extension determines format (.h5/.hdf5 for HDF5, .csv for CSV; defaults to .h5). | None |
| raw_format | bool | If True, preserves all original Tobii SDK column names and data (including 3D eye positions, gaze origins, etc.). If False (default), uses simplified column names and a subset of columns (gaze positions, pupil diameters, validity flags only). See Data Format Options for more information. | False |
| coordinate_units | str | Target coordinate system for 2D screen positions. With 'default', raw format keeps Tobii coordinates and simplified format converts to PsychoPy pixel coordinates, where (0,0) is at the screen center. Other options: 'tobii', 'pix', 'height', 'norm', 'cm', 'deg'. See Coordinate System Conversion for more information. | 'default' |
| relative_timestamps | str or bool | Controls timestamp format. With 'default', simplified format uses relative timestamps and raw format keeps absolute timestamps. Other options: True (always relative), False (always absolute). See Timestamp Format for more information. | 'default' |
Details
Data Format Options
The raw_format parameter controls which columns are included in the saved data file. Raw format preserves the complete data structure from the Tobii Pro SDK, which includes both 2D screen coordinates and 3D spatial information about eye position and gaze origin in the user coordinate system. This format is useful for advanced analyses that require the full geometric relationship between the eyes and the screen, or when you need to preserve all metadata provided by the eye tracker. The simplified format extracts only the essential data needed for most eye tracking analyses: gaze positions on screen, pupil diameters, and validity flags. This results in smaller files and easier data analysis for typical gaze visualization and AOI (Area of Interest) tasks.
Coordinate System Conversion
The coordinate_units parameter determines how 2D screen coordinates are represented in the output file. By default, raw format keeps coordinates in Tobii’s Active Display Coordinate System (ADCS) where values range from 0 to 1 with the origin at the top-left corner. This normalized system is screen-independent and useful for comparing data across different display setups. The simplified format defaults to PsychoPy pixel coordinates, which place the origin at the screen center with the y-axis pointing upward. This matches PsychoPy’s coordinate system and makes it seamless to overlay gaze data on your experimental stimuli. You can override these defaults for either format. For example, you might want pixel coordinates in raw format for easier visualization, or keep Tobii coordinates in simplified format for cross-screen comparisons. When converting to other PsychoPy units like ‘height’ or ‘norm’, the coordinates will match your PsychoPy window’s unit system. Note that in raw format, only the 2D display coordinates are converted; the 3D spatial coordinates (eye positions and gaze origins) always remain in meters as provided by the Tobii SDK.
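To illustrate the default conversion for simplified format (this is the underlying geometry, not the library's internal code), mapping Tobii's ADCS to PsychoPy pixel coordinates is a simple affine transform:

```python
def adcs_to_pix(x_tobii, y_tobii, screen_w_px, screen_h_px):
    """Map Tobii ADCS (0-1, origin top-left, y down) to PsychoPy
    pixels (origin at screen center, y up). Illustrative only."""
    x_pix = (x_tobii - 0.5) * screen_w_px
    y_pix = (0.5 - y_tobii) * screen_h_px  # flip the y-axis
    return x_pix, y_pix

print(adcs_to_pix(0.5, 0.5, 1920, 1080))    # (0.0, 0.0): screen center
print(adcs_to_pix(0.75, 0.25, 1920, 1080))  # (480.0, 270.0): upper-right
```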
Timestamp Format
The relative_timestamps parameter controls how time is represented in your data. Absolute timestamps preserve the exact microsecond values from the Tobii system clock, which are useful when you need to synchronize eye tracking data with other recording systems or when collecting multiple data files that need to be aligned temporally. Relative timestamps convert the first sample to time zero and express all subsequent times in milliseconds relative to that starting point, making it much easier to analyze individual trials or sessions. For example, with relative timestamps you can immediately see that an event occurred at 1500ms into your recording, whereas with absolute timestamps you would need to subtract the session start time first. The default behavior chooses relative timestamps for simplified format (since most analyses focus on within-session timing) and absolute timestamps for raw format (to preserve the complete temporal information). Both gaze samples and event markers use the same timestamp format to ensure proper synchronization when analyzing your data.
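For instance, converting absolute timestamps to session-relative milliseconds after the fact is straightforward (a sketch assuming a CSV recording with the documented TimeStamp column holding absolute microsecond values; the filename is a placeholder):

```python
import pandas as pd

df = pd.read_csv('session1_gaze.csv')  # recorded with relative_timestamps=False
# Zero the clock at the first sample and convert microseconds -> milliseconds
df['TimeStamp'] = (df['TimeStamp'] - df['TimeStamp'].iloc[0]) / 1000.0
```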
Examples
Basic Usage
Standard simplified format (PsychoPy pixels, relative timestamps)

```python
ET_controller.start_recording('data.h5')
# Creates: Left_X/Left_Y in pixels (center origin), TimeStamp from 0ms
```

Auto-generated timestamped filename

```python
ET_controller.start_recording()  # Creates YYYY-MM-DD_HH-MM-SS.h5
```

Format Options

Raw format with defaults (Tobii coords, absolute timestamps)

```python
ET_controller.start_recording('data.h5', raw_format=True)
# All Tobii columns, coords in 0-1 range, timestamps in microseconds
```

Raw format with pixel coordinates

```python
ET_controller.start_recording('data.h5', raw_format=True, coordinate_units='pix')
# All Tobii columns, display coords in PsychoPy pixels
```

Timestamp Control

Raw format with relative timestamps

```python
ET_controller.start_recording('data.h5', raw_format=True, relative_timestamps=True)
# Raw format but timestamps start at 0ms
```

Simplified format with absolute timestamps

```python
ET_controller.start_recording('data.h5', relative_timestamps=False)
# Simplified format but keeps absolute microsecond timestamps
```

Coordinate Units

Simplified format with Tobii coordinates

```python
ET_controller.start_recording('data.h5', coordinate_units='tobii')
# Simplified columns, coordinates in 0-1 range
```

Simplified format with height units

```python
ET_controller.start_recording('data.h5', coordinate_units='height')
# Simplified columns, coordinates in height units matching window
```

Complete Workflows

Standard experiment workflow

```python
# Setup and calibration
ET_controller.show_status()
ET_controller.calibrate(5)

# Start recording with defaults
ET_controller.start_recording('participant_01.h5')

# Run experiment with event markers
ET_controller.record_event('trial_1_start')
# ... present stimuli ...
ET_controller.record_event('trial_1_end')

# Stop recording
ET_controller.stop_recording()
```

Multi-file synchronization with absolute timestamps

```python
# Recording multiple synchronized files
ET_controller.start_recording('session1_gaze.h5', relative_timestamps=False)
# ... run session 1 ...
ET_controller.stop_recording()

# Session 2 can be synchronized using absolute timestamps
ET_controller.start_recording('session2_gaze.h5', relative_timestamps=False)
# ... run session 2 ...
ET_controller.stop_recording()
```

Advanced raw data collection

```python
# Collect all data with pixel coords and relative time
ET_controller.start_recording(
    'advanced_analysis.h5',
    raw_format=True,
    coordinate_units='pix',
    relative_timestamps=True
)
```

stop_recording
```python
ETracker.stop_recording(data_check=True)
```

Stop gaze data recording and finalize session.
Performs complete shutdown: stops data collection, cleans up resources, saves all buffered data, and optionally performs a comprehensive data quality check. Handles both simulation and real eye tracker modes appropriately.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| data_check | bool | If True (default), performs data quality check by reading the saved file and analyzing timestamp gaps to detect dropped samples. If False, skips the quality check and completes faster. Default True. | True |
Raises
| Type | Description |
|---|---|
| UserWarning | If recording is not currently active. |
Details
- All pending data in buffers is automatically saved before completion
- Recording duration is measured from start_recording() call
- Quality check reads the complete saved file to analyze gaps between ALL samples, including potential gaps between save_data() calls
- At 120 Hz, each sample should be ~8333 µs apart; gaps significantly larger than this indicate dropped samples (see the sketch after this list)
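The gap analysis can be approximated in a few lines (a hypothetical sketch with numpy, assuming sorted absolute microsecond timestamps; not the library's exact check):

```python
import numpy as np

def count_dropped(timestamps_us, fps):
    """Estimate dropped samples from inter-sample timestamp gaps."""
    expected_gap = 1_000_000 / fps  # ~8333 microseconds at 120 Hz
    gaps = np.diff(np.asarray(timestamps_us, dtype=float))
    # A gap of ~2x the expected interval means one sample is missing, etc.
    missing = np.round(gaps / expected_gap).astype(int) - 1
    return int(missing[missing > 0].sum())

ts = [0, 8333, 16666, 33333, 41666]  # one sample missing near 25000
print(count_dropped(ts, 120))        # 1
```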
Examples
Basic Usage
Standard stop with quality check

```python
ET_controller.start_recording('data.h5')
# ... run experiment ...
ET_controller.stop_recording()  # Shows quality report
```

Skip quality check for faster shutdown

```python
ET_controller.stop_recording(data_check=False)
```

Understanding Output

Expected quality check report

```python
# When data_check=True, you'll see output like:
# ╔════════════════════════════╗
# ║ Recording Complete         ║
# ╠════════════════════════════╣
# ║ Data collection lasted     ║
# ║ approximately 120.45 sec   ║
# ║ Data has been saved to     ║
# ║ experiment.h5              ║
# ║                            ║
# ║ Data Quality Report:       ║
# ║ - Total samples: 14454     ║
# ║ - Dropped samples: 0       ║
# ╚════════════════════════════╝
```