Computational Abstraction as Emotional Abstraction
Using CNNs to Explore ‘Human’ Expression and Emotion
Hillary, Zoe, and Vita
Some Central Questions:
- How do machines ‘understand,’ connect, and convey abstract ideas?
- How can emotion, abstraction, and expressiveness be translated through CNNs?
- How might these ideas convey notions of ‘humanness’ distinct to individuals through the use of different, user-specific datasets?
Intended Outcome
A tool(?); perhaps in the form of a website, a crowd-sourcing project, or an installation.
Potential Learning Models etc.:
- Convolutional Neural Network
- Style Transfer
- DCGAN - Deep Convolutional Generative Adversarial Networks
- t-SNE for similar video / sound / word structures (a rough sketch of combining this with a CNN follows below)
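
As a rough illustration of how a couple of these pieces could fit together, the sketch below embeds images with a pretrained CNN and then projects the features into 2-D with t-SNE, so that visually similar images land near each other. This is only a starting point, not our actual pipeline: the `images/` folder, the ResNet-18 backbone, and the t-SNE settings are all placeholder assumptions.

```python
# Minimal sketch: CNN feature extraction + t-SNE projection.
# Assumes a hypothetical folder of images at ./images/*.jpg.

import glob
import numpy as np
import torch
from PIL import Image
from sklearn.manifold import TSNE
from torchvision import models, transforms

# Pretrained CNN with its classification head removed -> 512-d feature vectors.
cnn = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
cnn.fc = torch.nn.Identity()
cnn.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

features = []
paths = sorted(glob.glob("images/*.jpg"))  # hypothetical dataset location
with torch.no_grad():
    for path in paths:
        img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
        features.append(cnn(img).squeeze(0).numpy())

# t-SNE collapses the 512-d feature space to 2-D for plotting / clustering.
# Note: perplexity must be smaller than the number of images.
embedding = TSNE(n_components=2, perplexity=5, init="pca",
                 random_state=0).fit_transform(np.array(features))
print(embedding.shape)  # (num_images, 2)
```

The same idea should carry over to sound or text if we swap the image CNN for an embedding model suited to that medium.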
Data:
- Sound
- Hand-drawn drawings
- Photo datasets
- User input (see the data-loading sketch after this list)
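
A minimal sketch of how user-specific images (scans of hand-drawn drawings, photos, uploads) could be gathered into a training set. The `user_data/` folder layout, image size, and batch size are hypothetical placeholders while we decide what data we actually collect.

```python
# Minimal sketch: load user-supplied images into batches for a CNN.
# Assumes a hypothetical layout with one subfolder per category, e.g.
# user_data/drawings/*.png and user_data/photos/*.jpg.

import torch
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Resize((128, 128)),  # small, uniform size for quick experiments
    transforms.ToTensor(),          # scales pixel values to [0, 1]
])

dataset = datasets.ImageFolder("user_data", transform=transform)
loader = torch.utils.data.DataLoader(dataset, batch_size=16, shuffle=True)

images, labels = next(iter(loader))
print(images.shape, dataset.classes)  # e.g. torch.Size([16, 3, 128, 128]) ['drawings', 'photos']
```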
Team Distribution:
We are all starting out by exploring different datasets, and it is likely that our projects will converge into some final exploration. If they do not converge, then we will have an interesting conversation about how they are interconnected.
Success / Failure:
Our tool does not need to “solve a problem”. We would like it to create a meaningful output of some sort, but at the moment we are fairly flexible about what that can mean.