Create an eye-tracking experiment
This page will show you how to collect eye-tracking data in a simple PsychoPy paradigm. We will build upon the exact same paradigm we created in the Getting started with PsychoPy tutorial. If you have not completed that tutorial yet, please go through it first, as we will be modifying that existing code.
The Tool: DeToX
For this tutorial, we will be using DeToX ("Detoxify your Eye Tracking").
Why DeToX? In previous versions of this tutorial, we used the raw tobii_research library. While powerful, using the raw SDK required writing complex "callback functions," managing memory buffers manually, and performing difficult timestamp synchronizations between the computer clock and the eye-tracker clock.
Tobii eye-tracker
This tutorial is specifically designed for Tobii eye-trackers. While the general logic of eye-tracking (calibration, recording, event logging) applies to any hardware, the specific code and installation steps here are for Tobii devices.
DeToX is a wrapper library we built to handle all that heavy lifting for you. It simplifies the workflow by:
Automatically handling the connection to the tracker.
Managing data streams and saving files (HDF5).
Synchronizing your event markers (e.g., "Stimulus Onset") with the eye-tracking timestamps.
While we created DeToX to be simple and infant friendly, it is not the only tool out there! You might also want to check:
Titta: A toolbox for Matlab and Python that also interfaces with Tobii trackers.
PsychoPy's Built-in I/O: PsychoPy has its own iohub module that supports various eye-trackers (SR Research EyeLink, Tobii, Gazepoint, etc.).
Preparation
Let's begin by importing the libraries that we will need for this example:
import os
from pathlib import Path
from psychopy import core, event, visual, sound
from DeToX import ETracker
Most of these imports should look familiar from our previous tutorial: we need them to locate files, handle timing, and present our stimuli.
The new addition here is ETracker. This is DeToX's main class and acts as your central hub for all eye-tracking operations. Throughout your experiment, you will interact with this single object to handle everything from calibration to data recording.
Window
As we have seen in previous tutorials, every PsychoPy experiment needs a window. This is the canvas where all your stimuli will appear and where participants will interact with your study.
Crucially, the ETracker class requires this window object to function properly. It uses the window's properties (like size and unit system) to correctly map eye-tracking coordinates to your screen.
Let's create one now:
# Create the experiment window
win = visual.Window(
    size=[1920, 1080],  # Window dimensions in pixels
    fullscr=True,       # Expand to fill the entire screen
    units='pix'         # Use pixels as the measurement unit
)

Connect to the eye-tracker
With our window prepared, the next step is to establish a connection to the hardware. Unlike standard SDKs that require you to manually search for devices and manage data streams, DeToX streamlines this process through a single main controller: the ETracker.
To initialize it, simply pass your PsychoPy window object. DeToX will automatically locate the first available Tobii tracker and configure the coordinate systems to match your screen settings.
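To get a feel for what this coordinate mapping involves: Tobii trackers report gaze in a normalized coordinate system (x and y from 0 to 1, with the origin at the top-left corner of the screen), while a 'pix' window puts (0, 0) at the screen centre with y pointing up. Below is a minimal sketch of that conversion; the function name and implementation are ours for illustration, not DeToX's actual code.

```python
def norm_to_pix(x_norm, y_norm, win_size=(1920, 1080)):
    """Map Tobii's normalized gaze coordinates (origin: top-left, y down)
    to PsychoPy 'pix' coordinates (origin: screen centre, y up)."""
    width, height = win_size
    x_pix = (x_norm - 0.5) * width
    y_pix = (0.5 - y_norm) * height
    return x_pix, y_pix

# The screen centre maps to (0, 0); the top-left corner to (-960, 540)
print(norm_to_pix(0.5, 0.5))  # (0.0, 0.0)
print(norm_to_pix(0.0, 0.0))  # (-960.0, 540.0)
```

This is why the window's size and unit system matter to the tracker: with different units or a different resolution, the same normalized gaze point lands on different screen coordinates.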
ET_controller = ETracker(win)
If you're following along without a Tobii eye tracker connected, you can still test everything using simulation mode. Just pass simulate=True when creating your ETracker:
ET_controller = ETracker(win, simulate=True)
This tells DeToX to collect data from your mouse position instead of an actual eye tracker: perfect for development, testing, or learning the workflow before you have hardware access.
Upon execution, DeToX connects to the device and prints a confirmation summary. This is a quick way to verify that your tracker is detected and running at the correct frequency:
╭─────────────────── Eyetracker Info ───────────────────╮
│ Connected to the eyetracker:                          │
│  - Model: Tobii Pro Fusion                            │
│  - Current frequency: 250.0 Hz                        │
│  - Current illumination mode: Default                 │
│ Other options:                                        │
│  - Possible frequencies: (30.0, 60.0, 120.0, 250.0)   │
│  - Possible illumination modes: ('Default',)          │
╰───────────────────────────────────────────────────────╯
Now that we are connected, we are ready to start recording!
Collect data
Great! You're now connected to the eye-tracker (or simulating it). However, we're not actually collecting any data yet, so let's fix that.
To begin data collection, simply call the start_recording method on your controller:
# Start recording data
ET_controller.start_recording(filename="testing.h5")
This sets everything in motion. The eye-tracker will now continuously collect data in the background while you run your experiment.
We use the HDF5 format (ending in .h5), which is a modern and efficient way to store scientific data. It keeps everything organized and fast, so you don't have to worry about managing massive, messy text files.
If you don't provide a filename, DeToX will automatically generate one for you based on the current time. For more details on the data structure, check the DeToX website.
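The general idea behind such time-based default names can be sketched with the standard library alone; the exact naming scheme DeToX uses is not shown here, so treat the prefix and format below as our own illustrative choices.

```python
from datetime import datetime

def default_filename(prefix="recording"):
    """Build a timestamped HDF5 filename, e.g. 'recording_20250101_120000.h5'."""
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    return f"{prefix}_{stamp}.h5"

print(default_filename())  # e.g. recording_20250101_120000.h5
```

Timestamped names like this make it hard to accidentally overwrite an earlier session's data.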
Triggers/Events
We have successfully started recording, and data is now being collected continuously in the background. Now you are free to present images, videos, sounds, or whatever your experimental design requires!
However, presenting stimuli is only half the battle. While the eye tracker records where the participant is looking, it is crucial to mark when specific events happen (e.g., "Image appeared", "Sound started").
Without these markers (or "triggers"), your data will just be a long, unbroken stream of coordinates, making it impossible to determine what the participant was looking at at any given moment. This synchronization is essential for analysis.
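To see why the markers matter, here is a toy sketch of what happens later at analysis time (nothing DeToX-specific; the timestamps and labels are made up): given the event timestamps, every gaze sample can be assigned to the most recent event with a simple binary search.

```python
from bisect import bisect_right

# Event markers logged during the experiment: (timestamp_ms, label)
events = [(0, 'Fixation'), (40, 'Circle'), (80, 'Complex')]
event_times = [t for t, _ in events]

def label_sample(timestamp):
    """Return the label of the most recent event at or before `timestamp`."""
    idx = bisect_right(event_times, timestamp) - 1
    return events[idx][1]

# A toy gaze stream sampled every 20 ms gets labelled like this:
labels = [label_sample(t) for t in range(0, 100, 20)]
print(labels)  # ['Fixation', 'Fixation', 'Circle', 'Circle', 'Complex']
```

Without the event timestamps, the gaze stream on its own carries no information about which stimulus was on screen.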
How to Record an Event
To mark a specific moment in time, simply call the record_event function. DeToX automatically captures the precise system timestamp and merges it into your data file.
You should call this function right after your window flips. Since win.flip() is the moment the stimulus actually appears on the screen, recording the event right after ensures your timestamp is as accurate as possible.
# Send event
ET_controller.record_event('Event number 1')

Save data
While the recording is active, your data (and events) are held in your computer's short-term memory (RAM) for speed. To make this data permanent, it must be written to a file on your hard drive.
There are two ways to do this:
The Standard Way: Save at the End
The simplest approach is to save everything once your experiment finishes. When you are done, simply call:
# Stop recording and save everything
ET_controller.stop_recording()
This function performs three critical tasks at once:
Stops the data stream from the eye tracker.
Saves all data currently in memory to your file.
Safely disconnects from the device.
You will see a confirmation message summarizing the session:
╭─────────────── Recording Complete ────────────────╮
│ Data collection lasted approximately 4.02 seconds │
│ Data has been saved to testing.h5                 │
╰───────────────────────────────────────────────────╯

The "Safe" Way: Periodic Saving
If your experiment is short, saving at the end is perfectly fine. However, for longer studies, we highly recommend saving intermittently.
If your computer were to crash halfway through a long session, you would lose all the data currently sitting in memory. To prevent this, you can "flush" the data to the disk during quiet moments, such as an Inter-Stimulus Interval (ISI) or a break.
Simply call this method whenever you want to secure the data collected so far:
# Append current data to file and clear memory
ET_controller.save_data()
This takes whatever is in memory, appends it to your file, and clears the buffer to free up RAM.
Note: Even if you use save_data() periodically, you must still call stop_recording() at the very end of your experiment to save the final chunk of data and disconnect from the eye-tracker.
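One way to guarantee that stop_recording() runs even if your experiment crashes mid-trial is Python's try/finally. The sketch below demonstrates the pattern with a stand-in recorder class (so it runs without DeToX or any hardware); in your script, ET_controller would take its place.

```python
class FakeRecorder:
    """Stand-in for ET_controller, purely to demonstrate the pattern."""
    def __init__(self):
        self.stopped = False

    def stop_recording(self):
        self.stopped = True

recorder = FakeRecorder()
try:
    # ... trials would run here; imagine one of them raises ...
    raise RuntimeError("simulated crash mid-experiment")
except RuntimeError:
    pass  # in a real script you might log the error and re-raise
finally:
    recorder.stop_recording()  # runs whether the trials crashed or not

print(recorder.stopped)  # True
```

Combined with periodic save_data() calls, this keeps data loss to the final unsaved chunk at worst.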
Create the Actual Experiment
Now that we have seen how to record and save data, letβs see how to combine these pieces into a complete study.
Short Recap of the Paradigm
We'll use the experimental design from Getting started with PsychoPy and add eye tracking to it. If you need a refresher on the paradigm, take a quick look at that tutorial.
Here's a brief summary: After a fixation cross, participants see either a circle or a square. The circle predicts a complex shape that will appear on the right side of the screen, while the square predicts a simple shape that will appear on the left.
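In plain Python terms, the 0/1 codes in the trial list double as list indices: 0 selects the circle cue and its complex target, 1 selects the square cue and its simple target. A stripped-down sketch (strings stand in for the actual image stimuli):

```python
cues = ['circle', 'square']      # index 0 -> circle, index 1 -> square
targets = ['complex', 'simple']  # each cue predicts its matching target
Trials = [0, 1, 0, 0, 1]

# The same index picks both the cue and the target for a trial
pairs = [(cues[t], targets[t]) for t in Trials]
print(pairs[:2])  # [('circle', 'complex'), ('square', 'simple')]
```

This is exactly how cues[trial] and targets[trial] work in the experiment loop below.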
Putting It All Together
Letβs build the complete experiment step by step.
Import Libraries and load the Stimuli
First, we need to import our libraries, set our working directory, and load our stimuli.
Again, this part is identical to our previous PsychoPy tutorial:
import os
from pathlib import Path
from psychopy import core, event, visual, sound
from DeToX import ETracker
#%% Load and prepare stimuli
# Setting the directory of our experiment
os.chdir(r'<<< YOUR PATH >>>>')
# Now create a Path object for the stimuli directory
stimuli_dir = Path('EXP') / 'Stimuli'
# Load images
fixation = visual.ImageStim(win, image=str(stimuli_dir / 'fixation.png'), size=(200, 200))
circle = visual.ImageStim(win, image=str(stimuli_dir / 'circle.png'), size=(200, 200))
square = visual.ImageStim(win, image=str(stimuli_dir / 'square.png'), size=(200, 200))
complex = visual.ImageStim(win, image=str(stimuli_dir / 'complex.png'), size=(200, 200), pos=(250, 0))
simple = visual.ImageStim(win, image=str(stimuli_dir / 'simple.png'), size=(200, 200), pos=(-250, 0))
# Load sound
presentation_sound = sound.Sound(str(stimuli_dir / 'presentation.wav'))
# List of stimuli
cues = [circle, square] # put both cues in a list
targets = [complex, simple] # put both rewards in a list
# Create a list of trials in which 0 means winning and 1 means losing
Trials = [0, 1, 0, 0, 1, 0, 1, 1, 0, 1]

Start recording
Now we are ready to connect to the eye tracker and start collecting data. With DeToX, this is just two lines of code: one to initialize the connection and one to start the recording stream.
#%% Record the data
# Connect to the eye tracker
ET_controller = ETracker(win)
# Start recording
# DeToX will automatically create this file and start saving data to it
ET_controller.start_recording(filename="testing.h5")

Present Our Stimuli
The eye tracking is running! Now we can loop through our trials and show the participant our stimuli.
The most critical step here is to mark exactly when a stimulus appears on the screen. We do this by sending an "event marker" to the data file. With DeToX, this is incredibly simple: immediately after win.flip() (which updates the screen), we call ET_controller.record_event('Label').
You will also notice the special trick we use during the Inter-Stimulus Interval (ISI) to save our data safely without disrupting the experiment timing.
#%% Trials
for trial in Trials:
    ### 1. Present the Fixation
    fixation.draw()
    win.flip()  # Stimulus appears
    ET_controller.record_event('Fixation')  # Log event immediately
    core.wait(1)

    ### 2. Present the Cue
    cues[trial].draw()
    win.flip()
    # Log specific cue type
    if trial == 0:
        ET_controller.record_event('Circle')
    else:
        ET_controller.record_event('Square')
    core.wait(3)

    ### 3. Wait for Saccadic Latency
    win.flip()
    core.wait(0.75)

    ### 4. Present the Target
    targets[trial].draw()
    win.flip()
    if trial == 0:
        ET_controller.record_event('Complex')
    else:
        ET_controller.record_event('Simple')
    presentation_sound.play()
    core.wait(2)

    ### 5. ISI and Smart Saving
    win.flip()
    ET_controller.record_event('ISI')
    # Start a clock to measure our ISI duration
    clock = core.Clock()
    # --- SAVE DATA ---
    # Flush the data currently in memory to the disk
    ET_controller.save_data()
    # Wait for whatever time is left in the 1-second ISI
    # This ensures the ISI is exactly 1s, even if saving took 0.1s
    core.wait(1 - clock.getTime())

    ### Check for escape key to exit
    keys = event.getKeys()
    if 'escape' in keys:
        ET_controller.stop_recording()
        win.close()
        core.quit()

A Note on Smart Saving
Did you catch the logic inside the ISI section?
As we mentioned in Save data, it is best to save your data intermittently to avoid loss if the computer crashes. The ISI is the perfect moment for this because the participant is just looking at a blank screen.
If you remember from our Getting Started with PsychoPy tutorial, we used a core.Clock() for the ISI instead of a simple core.wait(). This is exactly why!
Start Clock: We start a timer immediately when the ISI begins.
Save Data: We call ET_controller.save_data(). This might take 10 ms or 50 ms depending on your computer.
Wait for Remainder: We calculate 1 - clock.getTime().
This subtraction is the "cool" part. It ensures that the total ISI is exactly 1 second, automatically subtracting the time it took to save the data. If we just used core.wait(1) after saving, our ISI would be too long (1s + saving time).
Careful!!!
If saving the data takes more than 1 second, your ISI will also be longer. However, this should not be the case with typical studies where trials are not too long. Nonetheless, it's always a good idea to keep an eye out.
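The wait-the-remainder arithmetic can be sketched with the standard library alone: time.monotonic stands in for core.Clock, time.sleep for core.wait, and the 0.1-second sleep is a made-up stand-in for the save call.

```python
import time

ISI_DURATION = 1.0  # total ISI we want, in seconds

start = time.monotonic()   # plays the role of core.Clock()
time.sleep(0.1)            # stands in for ET_controller.save_data()
remaining = ISI_DURATION - (time.monotonic() - start)
time.sleep(max(0.0, remaining))  # never pass a negative wait

total = time.monotonic() - start
print(round(total, 2))  # close to 1.0, no matter how long the "save" took
```

The max(0.0, ...) clamp also guards against exactly the case this warning describes: if saving ever takes longer than the ISI, the sketch simply skips the extra wait instead of erroring on a negative duration.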
Stop recording
Almost done! We've collected data, sent events, and saved everything. The final step is to stop data collection (otherwise Python will keep getting endless data from the eye tracker!). We simply unsubscribe from the eye tracker:
# --- End Experiment ---
ET_controller.stop_recording() # Save remaining data and disconnect
win.close()
core.quit()
Note that we also closed the PsychoPy window, so that the stimulus presentation is also officially over. Well done!!! Now go and get your data!!! We'll see you back when it's time to analyze it.
END!!
Great job getting here!! It wasn't easy, but you did it. Here is all the code we made together:
import os
from pathlib import Path
from psychopy import core, event, visual, sound
from DeToX import ETracker
#%% Setup and Imports
# Screen dimensions
winsize = [1920, 1080]
# Setting the directory of our experiment
os.chdir(r'<<< YOUR PATH >>>>')
# Create Paths
stimuli_dir = Path('EXP') / 'Stimuli'
data_dir = Path('DATA') / 'RAW'
# 1. Create the Window
# We need this first so DeToX can measure pixels correctly
win = visual.Window(size=winsize, fullscr=True, units="pix", pos=(0,30), screen=1)
# 2. Initialize DeToX
# This automatically connects to the eye tracker
ET_controller = ETracker(win)
#%% Load Stimuli
# Load images
fixation = visual.ImageStim(win, image=str(stimuli_dir / 'fixation.png'), size=(200, 200))
circle = visual.ImageStim(win, image=str(stimuli_dir / 'circle.png'), size=(200, 200))
square = visual.ImageStim(win, image=str(stimuli_dir / 'square.png'), size=(200, 200))
# Note positions for targets
complex_stim = visual.ImageStim(win, image=str(stimuli_dir / 'complex.png'), size=(200, 200), pos=(250, 0))
simple_stim = visual.ImageStim(win, image=str(stimuli_dir / 'simple.png'), size=(200, 200), pos=(-250, 0))
# Load sound
presentation_sound = sound.Sound(str(stimuli_dir / 'presentation.wav'))
# List of stimuli
cues = [circle, square]
targets = [complex_stim, simple_stim]
# Create a list of trials (0 = Circle/Complex, 1 = Square/Simple)
Trials = [0, 1, 0, 0, 1, 0, 1, 1, 0, 1]
Sub = 'S001'
#%% Start Recording
# Start recording to an HDF5 file
# DeToX creates the file and starts the stream immediately
ET_controller.start_recording(filename=str(data_dir / f"{Sub}.h5"))
#%% Run Trials
for trial in Trials:
    ### 1. Present the Fixation
    fixation.draw()
    win.flip()
    ET_controller.record_event('Fixation')  # Log event
    core.wait(1)

    ### 2. Present the Cue
    cues[trial].draw()
    win.flip()
    if trial == 0:
        ET_controller.record_event('Circle')
    else:
        ET_controller.record_event('Square')
    core.wait(3)

    ### 3. Wait for Saccadic Latency
    win.flip()
    core.wait(0.75)

    ### 4. Present the Target
    targets[trial].draw()
    win.flip()
    if trial == 0:
        ET_controller.record_event('Complex')
    else:
        ET_controller.record_event('Simple')
    presentation_sound.play()
    core.wait(2)

    ### 5. ISI and Smart Saving
    win.flip()
    ET_controller.record_event('ISI')
    # Start timer for ISI
    clock = core.Clock()
    # Save data to disk safely
    ET_controller.save_data()
    # Wait for the remainder of the 1-second ISI
    core.wait(1 - clock.getTime())

    ### Check for escape key
    keys = event.getKeys()
    if 'escape' in keys:
        ET_controller.stop_recording()  # Stop safely
        win.close()
        core.quit()
# --- End Experiment ---
ET_controller.stop_recording() # Save remaining data and disconnect
win.close()
core.quit()