Artificial Intelligence and Non-verbal Communication: Reconfiguring Body Language

Overall Concept

Exploring the recognition, translation, and conception of body language, with particular interest in the “sigh.”

Data (Still Under Consideration)

  • Video data from movie scenes / internet
  • Audio
  • Personal data
  • Air pressure

We do not yet know what data disembodied body language would require. (A rough audio-based sigh-detection sketch follows below.)
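
As a starting point for the audio strand, here is a minimal sketch (not a validated detector) of how recorded audio might be scanned for sigh-like events: slow rise-and-fall bumps in frame energy lasting roughly one to three seconds. The file path, sample rate, window lengths, and thresholds are illustrative assumptions, and librosa is only one possible toolkit.

```python
# Minimal sketch: flag sigh-like breath events as sustained energy bumps
# in an audio clip. All numeric values below are illustrative assumptions.
import numpy as np
import librosa

def find_sigh_candidates(path, frame_length=2048, hop_length=512,
                         min_dur=0.8, max_dur=3.0, energy_ratio=1.5):
    y, sr = librosa.load(path, sr=16000)                  # mono, 16 kHz
    rms = librosa.feature.rms(y=y, frame_length=frame_length,
                              hop_length=hop_length)[0]   # per-frame energy
    baseline = np.median(rms)                             # "quiet" reference level
    active = rms > energy_ratio * baseline                # frames louder than baseline

    # Group consecutive active frames into segments and keep those whose
    # duration is plausible for a sigh (roughly one to three seconds).
    candidates, start = [], None
    for i, a in enumerate(np.append(active, False)):
        if a and start is None:
            start = i
        elif not a and start is not None:
            dur = (i - start) * hop_length / sr
            if min_dur <= dur <= max_dur:
                candidates.append((start * hop_length / sr, dur))
            start = None
    return candidates  # list of (onset_seconds, duration_seconds)

print(find_sigh_candidates("clip.wav"))  # "clip.wav" is a placeholder path
```

A detector this crude would conflate sighs with laughs, coughs, and passing traffic; the point is only to make the "what data do we need" question concrete.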

Questions and Considerations

  • What is the machine version of “sighing”? (the “sigh” is the impetus of the project)
  • Conception of language: Is there a difference between non-verbal and verbal communication?
  • Can technology recognize the nuances in body language that we feel make us particularly human?

Points of Interest

  • Translation of human body language
  • All of our methods of expression are a result of our bodies, but communication between software does not require haptic feeling.
  • Technology has sensory capabilities directed at humans, not at ‘itself’

Bigger Picture

  • Reflecting on a modern age of communication that leans on notations of emotion and the body (text, emojis, etc.)
  • The interpersonal relationship between human and AI
  • A technology / bot that communicates back to you using patterns of body language (a rough sketch follows below)
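
To make the bot idea concrete, here is a minimal, purely illustrative sketch of a text bot that answers not with words but with typed gestures: a pause, an ellipsis, slow “typing,” a drawn-out exhale. The gesture vocabulary, timing values, and the length-based rule for choosing a gesture are all assumptions.

```python
# Minimal sketch: a bot that replies with typed "body language" rather than words.
# Gesture set, timings, and the selection rule are illustrative assumptions.
import random
import sys
import time

GESTURES = {
    "sigh":  ("...", "haaa", 1.5),        # long pause, then a drawn-out exhale
    "shrug": ("", "¯\\_(ツ)_/¯", 0.5),
    "nod":   ("", "mm-hm", 0.3),
}

def respond_with_body_language(user_text):
    # Longer, heavier messages get a sigh; short ones get a nod or a shrug.
    key = "sigh" if len(user_text) > 80 else random.choice(["nod", "shrug"])
    lead, gesture, pause = GESTURES[key]
    time.sleep(pause)                      # the silence is part of the reply
    for ch in lead + gesture:
        sys.stdout.write(ch)
        sys.stdout.flush()
        time.sleep(0.1)                    # slow "typing" as an embodied rhythm
    print()

respond_with_body_language("I have been rewriting this paragraph all afternoon "
                           "and it still does not say what I mean.")
```

Whether a reply like this would register as a meaningful interaction is exactly the question posed under Outcomes below.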

Outcomes

Success

  • The user feels that they have had some kind of meaningful interaction

Failure

  • Interactions that feel random or arbitrary