ICM FINAL - PROGRESS
For our final, Shivani and I will be working on exploring digital communication and body language. Because users are increasingly forfeiting real conversations in favor of texts or emails, we wanted to explore what they are doing in order to express themselves. Based on our research (and personal experience), we found that punctuation, capitalization, and affective lengthening are used to convey tone, attitude, and emotion. We also noticed that existing programs such as Siri or Google Assistant lack this level of personalization, usually resulting in texts that don’t convey the desired emotion.
We decided to create a program that types the way the user sounds (or wants to sound). We are using the speech recognition in p5.speech to record the words, then pushing them into an array. Using p5.sound, we are analyzing the volume, frequency, and amplitude to gauge the user’s emotion.
Based on this, the program will apply the respective attributes (punctuation, capitalization, emojis, and/or affective lengthening) to generate a text that appropriately represents the emotion.
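As a rough sketch of that last step in plain JavaScript: a function that takes the array of recognized words plus a normalized loudness level and applies the attributes. The threshold values here are invented placeholders; the real ranges still need to come from our data.

```javascript
// Hypothetical sketch: apply text attributes based on a normalized
// loudness level between 0.0 and 1.0 (like p5.sound's mic.getLevel()).
// The thresholds are placeholders until we map out the real ranges.
function styleText(words, level) {
  const text = words.join(' ');
  if (level > 0.8) {
    // Very loud: all caps plus emphatic punctuation.
    return text.toUpperCase() + '!!!';
  } else if (level > 0.5) {
    // Moderately loud: affective lengthening of the last word
    // (e.g. "yes" -> "yesss") plus an exclamation point.
    const last = words[words.length - 1];
    const lengthened = last + last.slice(-1).repeat(2);
    return words.slice(0, -1).concat(lengthened).join(' ') + '!';
  } else if (level > 0.2) {
    // Neutral range: a plain sentence with a period.
    return text + '.';
  }
  // Very quiet: trailing ellipsis to suggest a subdued tone.
  return text + '...';
}

console.log(styleText(['i', 'am', 'so', 'excited'], 0.9)); // → "I AM SO EXCITED!!!"
console.log(styleText(['ok', 'sure'], 0.1));               // → "ok sure..."
```

This keeps the text-styling logic separate from the audio analysis, so the thresholds can be swapped out once we have studied the actual mic data.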
Here is the sketch so far: https://editor.p5js.org/aileenstanziola/sketches/BJRPuCnCm
We still need to:
Get p5.sound to log data until the last word is pushed to the array, then stop.
Study the amplitude, volume, and frequency in the data and map out the ranges.
Categorize the attributes accordingly and apply them using if statements.
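The last two steps could start out something like this sketch, assuming made-up range boundaries (the real ones will come from studying the logged data) and placeholder emoji choices:

```javascript
// Hypothetical sketch: bucket a measured level into an emotion
// category with if statements, then look up the attributes to apply.
// Boundaries and emoji are placeholders, not measured values.
function categorizeEmotion(level) {
  if (level > 0.75) return 'excited';
  if (level > 0.4) return 'happy';
  if (level > 0.15) return 'neutral';
  return 'subdued';
}

// Attributes to apply for each category.
const attributes = {
  excited: { caps: true,  punctuation: '!!!', emoji: '🎉', lengthen: true },
  happy:   { caps: false, punctuation: '!',   emoji: '🙂', lengthen: true },
  neutral: { caps: false, punctuation: '.',   emoji: '',   lengthen: false },
  subdued: { caps: false, punctuation: '...', emoji: '',   lengthen: false },
};

console.log(categorizeEmotion(0.9));                          // → "excited"
console.log(attributes[categorizeEmotion(0.05)].punctuation); // → "..."
```

Splitting categorization from the attribute table means we can tune the ranges and the attributes independently once the data is in.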