Sighs Final Presentation
sighs
by Chelly Jin, Jack Turpin, and Youjin Chung
An exploration of how analyzing and applying nonverbal communication in computing might illuminate new perspectives on human/computer interfaces.
Collecting Data
We provided a list of six characteristics and prompted each person to sigh in the manner they felt best identified or characterized each word.
The Characteristics
- dismay
- dissatisfaction
- boredom
- futility
- relief
- lovelorn
We collected data from friends and acquaintances; however, a good portion of our data came from our Mechanical Turk request. Our parameters for the request:
Define
- openFrameworks to retrieve the audio data.
- Wekinator to record the sigh audio and train the model.
- Processing to showcase the output.
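The tools in this pipeline pass data between each other over OSC, which is how Wekinator receives its inputs. As a sketch of what those messages look like on the wire, here is a minimal OSC message encoder in pure Python; the `/wek/inputs` address and port 6448 are Wekinator's documented defaults, while the three float values are placeholders for whatever audio features the openFrameworks side extracts:

```python
import struct

def osc_pad(raw: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    return raw + b"\x00" * (4 - len(raw) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    # Encode an OSC message: padded address, padded type-tag string
    # (one 'f' per argument), then big-endian float32 values.
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))
    for value in floats:
        msg += struct.pack(">f", value)
    return msg

# Wekinator listens for feature inputs on /wek/inputs (UDP port 6448
# by default). The feature values here are illustrative placeholders.
packet = osc_message("/wek/inputs", 0.42, 0.17, 0.93)

# Sending it (commented out so the sketch runs without Wekinator open):
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(packet, ("127.0.0.1", 6448))
```

In practice a library such as python-osc (or openFrameworks' ofxOsc) would handle this encoding, but the hand-rolled version shows exactly what Wekinator expects to receive.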
Generate
We used Word2Vec and t-SNE word embeddings. Given a word, the system links it to the closest characteristic (dismay, dissatisfaction, boredom, futility, relief, lovelorn) and plays an audio snippet of that characteristic.
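A minimal sketch of the closest-characteristic lookup, assuming word vectors are available: `toy_vector` below is a hypothetical stand-in for a trained Word2Vec model, and the audio file path is illustrative rather than the project's actual folder layout.

```python
import math
import random

CHARACTERISTICS = ["dismay", "dissatisfaction", "boredom",
                   "futility", "relief", "lovelorn"]

def toy_vector(word, dim=8):
    # Hypothetical stand-in: the real pipeline would look the word
    # up in a trained Word2Vec embedding instead.
    rng = random.Random(word)  # deterministic per word
    return [rng.uniform(-1.0, 1.0) for _ in range(dim)]

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def closest_characteristic(word):
    """Map an arbitrary input word to the nearest of the six sigh labels."""
    v = toy_vector(word)
    return max(CHARACTERISTICS, key=lambda c: cosine(toy_vector(c), v))

# The matched label then selects which recorded sigh to play back,
# e.g. an audio file named after the characteristic (illustrative path):
label = closest_characteristic("weary")
audio_path = "audio/%s.wav" % label
```

With real Word2Vec vectors, semantically related inputs ("tedious", "dull") would land near "boredom" in embedding space, which is what makes the nearest-neighbor lookup meaningful.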
This is a screenshot of the audio sample folder:
Utilizing Jupyter to train the model:
These are the trained texts: