👁️
Eye-Tracking & Visual Attention
Investigating how visual attention operates, using eye-tracking and computational
models based on Bundesen's Theory of Visual Attention (TVA).
Tobii eye-trackers capture gaze trajectories, fixation dynamics, and
Task-Evoked Pupillary Responses (TEPR); these signals are then linked to higher-level
cognitive processes such as reading fluency, perceptual load, and attentional
capacity (see the preprocessing sketch below).
Gaze Analysis
TEPR
Tobii
Fixation
Visual Attention
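A minimal sketch of the TEPR baseline-correction step, assuming the pupil trace has
already been exported from the Tobii stream, blink-interpolated, and indexed by time
in seconds (the helper name and window lengths are illustrative):

```python
import pandas as pd

def tepr(pupil: pd.Series, onset: float,
         baseline_s: float = 0.5, window_s: float = 3.0) -> pd.Series:
    """Baseline-corrected Task-Evoked Pupillary Response.

    pupil : pupil diameter in mm, indexed by time in seconds.
    onset : stimulus onset time in seconds.
    """
    # Subtractive baseline: mean diameter over the pre-stimulus window.
    baseline = pupil.loc[onset - baseline_s:onset].mean()
    # Dilation relative to baseline over the response window,
    # re-indexed to time since stimulus onset.
    response = pupil.loc[onset:onset + window_s] - baseline
    response.index = response.index - onset
    return response
```

Subtractive baselining as above is the commonly recommended default; divisive
baselining is the main alternative when baseline diameter varies strongly across trials.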
🧮
Cognitive Modelling: TVA, Psychophysics & Bayesian Inference
Unifying Bundesen's TVA with Bayesian hierarchical modelling to quantify individual
differences in visual processing: per-participant estimation of attentional parameters
(visual processing speed C, VSTM storage capacity K, perceptual threshold t₀)
via MCMC in PyMC (see the model sketch below). Psychophysical experiments examine how
typography, font legibility, and other stimulus properties modulate perception, with
posterior difference distributions, Growth Curve Analysis (GCA), and LOO-CV for
robust inference.
TVA
PyMC · MCMC
Psychophysics
GCA
Hierarchical Bayes
LOO-CV
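A minimal sketch of the hierarchical estimation idea, using a simplified single-letter
report design with synthetic data; the priors, trial counts, and 1/26 guessing floor
are illustrative assumptions, not the fitted models themselves:

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(1)

# Synthetic single-letter report data: each participant is tested at
# several exposure durations tau (s); accuracy follows TVA's
# exponential race, p = 1 - exp(-C * (tau - t0)) for tau > t0.
n_part = 8
taus = np.array([0.02, 0.05, 0.08, 0.12, 0.20])
C_true = rng.lognormal(np.log(40.0), 0.3, n_part)          # items/s
t0_true = rng.normal(0.015, 0.004, n_part).clip(min=0.0)   # seconds
guess = 1.0 / 26.0                                         # letter guessing floor

pid = np.repeat(np.arange(n_part), taus.size)
tau = np.tile(taus, n_part)
p_true = guess + (1 - guess) * (
    1 - np.exp(-C_true[pid] * np.clip(tau - t0_true[pid], 0, None))
)
y = rng.binomial(n=20, p=p_true)                           # correct out of 20 trials

with pm.Model() as tva_model:
    # Group-level (hierarchical) priors over participants.
    mu_c = pm.Normal("mu_c", mu=np.log(30.0), sigma=1.0)
    sd_c = pm.HalfNormal("sd_c", sigma=1.0)
    C = pm.LogNormal("C", mu=mu_c, sigma=sd_c, shape=n_part)
    t0 = pm.TruncatedNormal("t0", mu=0.02, sigma=0.01, lower=0.0, shape=n_part)
    # TVA accuracy with a guessing floor, clipped at the threshold.
    race = 1 - pm.math.exp(-C[pid] * pm.math.maximum(tau - t0[pid], 0.0))
    theta = guess + (1 - guess) * race
    pm.Binomial("y", n=20, p=theta, observed=y)
    idata = pm.sample(1000, tune=1000, target_accept=0.9)
```

Storage capacity K only becomes identifiable in multi-item whole-report designs, so it
is omitted from this single-item sketch; model comparison such as LOO-CV then operates
on the resulting posterior draws via ArviZ.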
📡
Multimodal Signal Processing
Designing signal processing pipelines that fuse heterogeneous physiological streams —
eye-tracking, pupillometry, EEG, and facial action units — into unified representations
of cognitive and affective state. Research covers time-series alignment (sketched
below), artifact rejection, feature engineering, and robust multimodal models for
real-world sensing contexts including reading, workplace monitoring, and HRI.
Central theme of the MSPARC workshop at IEEE ICASSP 2026.
Signal Fusion
EEG · Pupillometry
Feature Engineering
Time-Series
ICASSP 2026
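As a flavour of the alignment step, a minimal sketch that joins two synthetic streams
recorded at different rates onto a shared analysis grid (rates, column names, and
tolerances are placeholders; real pipelines filter and artifact-reject first):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Hypothetical streams on a common clock but different rates:
# 120 Hz pupil diameter and 256 Hz single-channel EEG.
t_pupil = np.arange(0.0, 10.0, 1 / 120)
pupil = pd.DataFrame({"t": t_pupil, "pupil_mm": 3.0 + 0.1 * np.sin(t_pupil)})
t_eeg = np.arange(0.0, 10.0, 1 / 256)
eeg = pd.DataFrame({"t": t_eeg, "eeg_uv": rng.normal(size=t_eeg.size)})

# Nearest-neighbour join onto a 60 Hz analysis grid; the tolerance
# turns sensor gaps (blinks, dropped packets) into NaNs rather than
# silently filling them with stale samples.
grid = pd.DataFrame({"t": np.arange(0.0, 10.0, 1 / 60)})
fused = pd.merge_asof(grid, pupil, on="t", direction="nearest", tolerance=1 / 100)
fused = pd.merge_asof(fused, eeg, on="t", direction="nearest", tolerance=1 / 200)
```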
🤖
Affective Computing & Human-Robot Interaction
Building emotionally intelligent machines that can perceive, interpret, and respond
to human affective states. Facial expression recognition (FER) with dimensional
valence/arousal modelling (see the evaluation sketch below), adapted for socially
assistive robots supporting citizens
with Traumatic Brain Injury (TBI). Includes work on spontaneous affect in naturalistic
environments (EU Horizon 2020 WorkingAge project, Cambridge AFAR Lab), where systems
must be robust to unconstrained lighting, occlusion, and real-world variability.
FER
Valence/Arousal
HRI
TBI
Affective AI
WorkingAge
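Dimensional valence/arousal estimators in this space are conventionally scored with
Lin's concordance correlation coefficient (CCC), which penalises bias and scale
mismatch in a way plain Pearson correlation does not; a minimal NumPy version:

```python
import numpy as np

def ccc(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Lin's concordance correlation coefficient for continuous
    valence or arousal traces (1.0 = perfect agreement)."""
    mu_t, mu_p = y_true.mean(), y_pred.mean()
    var_t, var_p = y_true.var(), y_pred.var()
    cov = ((y_true - mu_t) * (y_pred - mu_p)).mean()
    return 2 * cov / (var_t + var_p + (mu_t - mu_p) ** 2)
```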
🦾
Robotic Rehabilitation
Developing socially assistive and therapeutic robotic systems that support cognitive
and physical rehabilitation for individuals with neurological conditions. PhD research
focused on deploying facial emotion recognition within robot-assisted therapy for
citizens with Traumatic Brain Injury (TBI) — enabling robots to perceive patient
affective states and adapt therapeutic interactions in real time. Research intersects
clinical HRI, affective sensing, and the ethical deployment of autonomous systems
in healthcare and assistive technology environments.
Assistive Robotics
TBI Therapy
Social Robots
Neurological Rehab
Clinical HRI
🖥️
Human-Machine Interaction & Adaptive Interfaces
Studying how humans interact with intelligent systems — and using that understanding
to build interfaces that adapt in real time to the user's cognitive and emotional
state. Research spans gaze-contingent reading aids, attention-aware document
rendering, and closed-loop interfaces that leverage eye-tracking and pupillometry
to reduce cognitive load and personalise content presentation (a toy adaptation
loop is sketched below). Bridges HCI
methodology with physiological computing and AI-driven adaptation.
Gaze-Contingent UI
Adaptive Systems
HCI
Cognitive Load
Personalisation
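A toy sketch of the closed-loop idea, purely illustrative: a rolling pupil baseline
stands in for a calibrated cognitive-load model, and the class name, window, and
threshold are invented for the example:

```python
from collections import deque
import numpy as np

class LoadAdaptiveRenderer:
    """Toy closed-loop policy: keep a rolling baseline of pupil
    diameter and switch to a simplified presentation whenever the
    incoming sample sits well above that baseline (a crude proxy
    for elevated cognitive load)."""

    def __init__(self, window: int = 600, z_threshold: float = 1.5):
        self.history = deque(maxlen=window)  # ~5 s of samples at 120 Hz
        self.z_threshold = z_threshold

    def update(self, pupil_mm: float) -> str:
        self.history.append(pupil_mm)
        if len(self.history) < self.history.maxlen:
            return "normal"  # still accumulating a baseline
        mu, sd = float(np.mean(self.history)), float(np.std(self.history))
        z = (pupil_mm - mu) / sd if sd > 0 else 0.0
        return "simplified" if z > self.z_threshold else "normal"
```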
🧬
Deep Learning & Computer Vision
Applying deep neural architectures to perception and cognition problems: convolutional
networks and transformers for facial action unit detection, expression recognition,
and affect estimation from video streams. Research emphasises domain generalisation,
transfer learning across affective datasets, and the design of lightweight models
suitable for real-time deployment in embedded and robotic systems (illustrated
below). Also encompasses
BERT-based sequence models applied to gaze reading data for language identification.
CNNs · Transformers
BERT
Transfer Learning
Action Units
Real-Time CV
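For flavour, a deliberately small PyTorch sketch of a multi-label action-unit detector
(layer sizes and the 12-AU head are placeholders, not a published architecture); since
AUs can co-occur, each output is an independent sigmoid trained with BCEWithLogitsLoss:

```python
import torch
import torch.nn as nn

class TinyAUNet(nn.Module):
    """Illustrative lightweight CNN for multi-label facial action
    unit detection: one logit per AU, sigmoid applied at inference."""

    def __init__(self, n_aus: int = 12):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                 # global pooling
        )
        self.head = nn.Linear(64, n_aus)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))  # raw logits

# One 112x112 face crop -> 12 AU logits.
logits = TinyAUNet()(torch.randn(1, 3, 112, 112))
probs = torch.sigmoid(logits)
```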
🌱
Human-Centred AI
Advocating for AI systems designed with human needs, values, and diversity at their
core. This thread runs through all my work — from building models that explain their
inferences in terms of measurable cognitive parameters, to designing experiments with
genuine ecological validity, to ensuring assistive technologies serve populations
with neurological and cognitive differences. Engages with transparency, fairness,
and the responsible deployment of AI in health, education, and workplace settings.
Explainable AI
Responsible AI
Accessibility
Fairness
AI Ethics
🌐
NLP & Gaze-Based Language Modelling
Combining natural language processing with gaze behavioural data to study reading
and language processing across populations. BERT-based classifiers applied to fixation
sequences for native language identification from reading patterns (NLIR-BERT), evaluated
on the MECO-L2 multilingual corpus spanning 13 languages. Explores how gaze reflects
lexical access, syntactic processing, and L2 reading proficiency, using Monte Carlo
simulation for robust small-sample statistical inference (see the permutation-test
sketch below).
BERT · NLP
NLIR
MECO-L2
Multilingual
Reading Science
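On the small-sample inference point: a minimal Monte Carlo permutation test of a mean
difference between two reader groups (the function and its defaults are illustrative,
not the published analysis):

```python
import numpy as np

def permutation_test(a: np.ndarray, b: np.ndarray,
                     n_iter: int = 10_000, seed: int = 0) -> float:
    """Two-sided Monte Carlo permutation test on the difference of
    group means; makes no normality assumption, which suits the
    small per-language samples of eye-tracking corpora."""
    rng = np.random.default_rng(seed)
    observed = abs(a.mean() - b.mean())
    pooled = np.concatenate([a, b])
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)                      # random relabelling
        diff = abs(pooled[:a.size].mean() - pooled[a.size:].mean())
        hits += diff >= observed
    return (hits + 1) / (n_iter + 1)             # avoids p = 0 exactly
```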