Degree: Doctor of Philosophy
Major Professor: Aaron T. Buss
Committee Members: A. Caglar Tas, Daniela M. Corbetta, Jared M. Porter
Visual working memory (VWM) guides the motor system by temporarily keeping relevant information in mind. As an interface between perception and action, VWM plays a critical role in supporting goal-directed behavior. Research on the relationship between VWM and action has primarily focused on the effect of VWM on motor output. Traditional approaches index outcome measures, such as response accuracy, but this practice provides limited information about the underlying VWM processes. Conversely, the influence of action on VWM has received less attention, and its neural correlates are not well understood. In this thesis, I examined VWM-action links using functional near-infrared spectroscopy (fNIRS) and mouse-tracking to record real-time trajectories of participants' motor responses. Experiment 1 aimed to characterize the relationship between movement dynamics, VWM performance, and their associated neural activity in a standard change detection task. Experiments 2 and 3 focused on the effect of action on VWM encoding using change detection tasks that manipulated the task-relevance of the action (Experiment 2) and the action-relevance of the items held in VWM (Experiments 2 and 3). The results showed that action enhanced VWM encoding of action-relevant features but impaired memory for action-irrelevant features. Moreover, the frontoparietal VWM network was differentially associated with action-relevant and action-irrelevant memory performance. Together, these findings suggest a trade-off between action and VWM encoding, in which representations of action-relevant features are prioritized while action-irrelevant features are suppressed. These results support and extend the motor-induced encoding effect, demonstrating that action selectively enhances VWM encoding of action-relevant features.
Kinder, Kaleb Thomas, "Visual Working Memory Encoding and Action: An Investigation using fNIRS and Mouse-tracking." PhD diss., University of Tennessee, 2023.