Journal of the Association for Technology in Music Instruction

Abstract

The authors present a computational interface designed to integrate affective responses arising from visual and aural stimuli into a unified audio-visual representation. This representation serves as a framework for applications in algorithmic composition and game development, facilitating the creation of cohesive, affect-driven multimedia experiences.
