Zach Chalkley (2015448)

Leveraging Deep Neural Networks for the Creation of Expressive Musical Compositions

Project Abstract

The ability to create art has long been recognized as uniquely human, serving as a powerful means of communication and emotional expression. With rapid advances in machine learning, systems such as DALL·E, Midjourney, and Google’s MelodyRNN have demonstrated remarkable proficiency in generating art, blurring the line between human and machine creativity. This project leverages Recurrent Neural Networks (RNNs) to produce musical compositions that evoke specific emotional responses in listeners. Through a survey in which participants classify the compositions by their perceived sentiment, we evaluate how effectively the RNN model captures and conveys emotion through music. The novelty of this research lies in its focus on using RNNs to create music tailored to evoke desired emotions, particularly in the context of film and video games, where music plays a crucial role in shaping the emotional experience. The work contributes insights into the intersection of AI and artistic creation, opening new possibilities for using technology to enhance emotional storytelling across media.
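The abstract does not specify the model architecture beyond "RNN", but to make the approach concrete, a minimal sketch of a note-level recurrent model of the kind described might look like the following. This is an illustration only, assuming PyTorch, an LSTM over MIDI pitch tokens, and invented names (NoteLSTM, sample); it is not the project's actual code.

    import torch
    import torch.nn as nn

    class NoteLSTM(nn.Module):
        """Minimal LSTM that predicts the next note token in a melody."""

        def __init__(self, vocab_size=128, embed_dim=64, hidden_dim=256):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)  # one token per MIDI pitch
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.head = nn.Linear(hidden_dim, vocab_size)     # logits over the next note

        def forward(self, notes, state=None):
            x = self.embed(notes)                 # (batch, seq) -> (batch, seq, embed_dim)
            out, state = self.lstm(x, state)      # carry hidden state for sampling
            return self.head(out), state

    @torch.no_grad()
    def sample(model, seed, length=64, temperature=1.0):
        """Autoregressively extend a seed sequence of note tokens."""
        model.eval()
        notes = list(seed)
        tokens = torch.tensor([seed])             # shape (1, len(seed))
        state = None
        for _ in range(length):
            logits, state = model(tokens, state)
            probs = torch.softmax(logits[0, -1] / temperature, dim=-1)
            nxt = torch.multinomial(probs, 1).item()
            notes.append(nxt)
            tokens = torch.tensor([[nxt]])        # feed only the newest note back in
        return notes

    model = NoteLSTM()
    print(sample(model, seed=[60, 62, 64], length=16))  # untrained: output is random

After training on a corpus of melodies, sampling at lower temperatures yields more conservative continuations, a knob that could plausibly be tied to the target sentiment.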

Keywords: Machine Learning, Deep Neural Network, Audio Generation

Conference Details

Session: Poster Session A at Poster Stand 49

Location: Sir Stanley Clarke Auditorium, Tuesday 7th, 13:30 – 17:00

Markers: Jens Blanck, Arno Pauly

Course: BSc Computer Science, 3rd Year

Future Plans: I’m looking for work