Emotify

Song recommendations based on facial emotion detection.

How could music services improve your life if they knew when you were feeling sad or down, anxious, stressed, or happy?

 

Emotify is a project that investigates what it would be like if music companies like Spotify had access to your emotion data while you listened to music. The final prototype is a desktop app that recommends songs based on the emotion detected in the user's webcam feed.

 
The many faces of Jenni.

 

Role: UX Design, UI Design, Programming
Tools Used: Electron, Node.js, HTML/CSS, JavaScript
Final Prototype: Interactive Electron app

 

Project Context

What could services do if they knew when you were feeling sad or down, anxious, stressed, or happy? What would it be like if music companies had access to your emotion data while you were listening to music?

Maybe they would be able to figure out which songs bring you joy, or how music affects your mood. Access to each user's emotion data could surface patterns that would otherwise stay invisible.

In this project, I explore these questions and build a final application that uses facial emotion detection to recommend songs to the user.

 

Exploration

If companies had access to your facial emotion data while you were listening to music, they would be able to tell what songs make you happy.

They would be able to tell how each song affects your mood, and construct an emotion timeline of your day.

They might be able to tell which parts of a song bring you the most joy, or how you feel over the course of a song.

 
Speculative emotion timeline based on what the user is listening to.

 
Speculative timeline highlighting areas the user felt most happy while listening to a song.
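
To make the timeline idea concrete, here is a minimal sketch in JavaScript (the stack the prototype is built on) of how emotion samples could be logged against whatever track is playing. The sample shape, the logEmotion helper, and the five-second interval are illustrative assumptions on my part, not code from the prototype.

// Hypothetical sketch: sample the dominant facial emotion on an interval
// and tag it with the track playing at that moment, building the kind of
// per-song emotion timeline pictured above.
const timeline = [];

function logEmotion(expressions, currentTrack) {
  // expressions: per-emotion probabilities, e.g. { happy: 0.91, sad: 0.02, ... }
  const [emotion, score] = Object.entries(expressions)
    .sort(([, a], [, b]) => b - a)[0]; // keep the highest-scoring emotion

  timeline.push({
    timestamp: Date.now(),
    track: currentTrack, // e.g. { name, artist, positionMs }
    emotion,
    score,
  });
}

// Sample every 5 seconds; detectExpressions and getCurrentTrack stand in
// for the webcam and player integrations.
// setInterval(async () => logEmotion(await detectExpressions(), getCurrentTrack()), 5000);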

 

Implementation

For this project, I used webcam-based facial emotion detection as my input feed, paired with the Spotify recommendations API, to find songs that match the emotion detected in your face. I imagine this as one way music services like Spotify could use your emotion data, but there are others: music companies might buy your emotion data to make songs that bring the most joy (catchy songs, perhaps).
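
The prototype's source isn't shown here, but a minimal sketch of the pipeline could look like the following. I'm assuming face-api.js for expression detection and a pre-obtained Spotify access token; the mapping from emotions to Spotify's tuneable audio-feature targets (valence roughly tracks musical positivity) is an illustrative guess, not the exact mapping used in the prototype.

import * as faceapi from 'face-api.js';

// Load the detector and expression models once at startup (served from /models).
await Promise.all([
  faceapi.nets.tinyFaceDetector.loadFromUri('/models'),
  faceapi.nets.faceExpressionNet.loadFromUri('/models'),
]);

// Illustrative mapping from a detected emotion to Spotify's recommendation targets.
const EMOTION_TARGETS = {
  happy:   { valence: 0.9, energy: 0.8 },
  sad:     { valence: 0.2, energy: 0.3 },
  angry:   { valence: 0.3, energy: 0.9 },
  neutral: { valence: 0.5, energy: 0.5 },
};

async function detectEmotion(video) {
  // video: an HTMLVideoElement showing the webcam stream.
  const result = await faceapi
    .detectSingleFace(video, new faceapi.TinyFaceDetectorOptions())
    .withFaceExpressions();
  if (!result) return 'neutral'; // no face found this frame
  // face-api.js returns a probability per expression; take the strongest one.
  return Object.entries(result.expressions)
    .sort(([, a], [, b]) => b - a)[0][0];
}

async function recommendSong(emotion, accessToken) {
  const { valence, energy } = EMOTION_TARGETS[emotion] ?? EMOTION_TARGETS.neutral;
  const params = new URLSearchParams({
    seed_genres: 'pop', // a fixed seed genre keeps the sketch simple
    target_valence: String(valence),
    target_energy: String(energy),
    limit: '1',
  });
  const res = await fetch(`https://api.spotify.com/v1/recommendations?${params}`, {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  const { tracks } = await res.json();
  return tracks[0]; // the recommended track
}

Running detectEmotion on an interval and feeding the returned track into an embedded player would close the loop.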

 
 

Demo of the working application. A next step would be a smoother UI; this was a rapid prototype programmed in the short timespan we were given.

Reflections

This was my first time working with Electron, and building out the interactive app was really fun! When I tested it with my classmates, everyone loved it and wanted to use it for their own parties or family events. Although unintended, the idea of a shared DJ playlist based on everyone's emotions sounds quite appealing. My next steps would be to iterate on and refine the user experience and interface design, and to do more testing!