Journal of the Association for Technology in Music Instruction
Abstract
The authors present a computational interface designed to integrate affective responses arising from visual and aural stimuli into a unified audio-visual representation. This representation serves as a framework for applications in algorithmic composition and game development, facilitating the creation of cohesive, affect-driven multimedia experiences.
Recommended Citation
Cestari, Caua and Tavares, Angelo Gabriel (2025) "Integrating Visual and Sonic Experiences through Modeling Affective Communication," Journal of the Association for Technology in Music Instruction: Vol. 5: No. 2, Article 3.
Available at: https://trace.tennessee.edu/jatmi/vol5/iss2/3
Included in
Composition Commons, Music Performance Commons, Other Music Commons