Emotion Markup Language

The W3C has just published the first draft of the Emotion Markup Language (EmotionML 1.0).

Um, why?

Use cases for EmotionML can be grouped into three broad types:

  1. Manual annotation of material involving emotionality, such as annotation of videos, of speech recordings, of faces, of texts, etc;
  2. Automatic recognition of emotions from sensors, including physiological sensors, speech recordings, facial expressions, etc., as well as from multi-modal combinations of sensors;
  3. Generation of emotion-related system responses, which may involve reasoning about the emotional implications of events, emotional prosody in synthetic speech, facial expressions and gestures of embodied agents or robots, the choice of music and colors of lighting in a room, etc.
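To make the first use case concrete, here's a rough sketch of what annotating a video clip might look like in EmotionML. The element and attribute names (`<emotion>`, `<category>`, `<reference>`) and the vocabulary URI follow examples published with the spec, but this is an early draft, so treat the details as illustrative rather than normative:

```xml
<!-- Sketch: tagging a segment of a video with a categorical emotion.
     Names follow the EmotionML 1.0 draft examples; details may change. -->
<emotionml xmlns="http://www.w3.org/2009/10/emotionml"
           category-set="http://www.w3.org/TR/emotion-voc/xml#everyday-categories">
  <emotion>
    <!-- The emotion category, with an optional intensity-like value -->
    <category name="amusement" value="0.8"/>
    <!-- What the annotation refers to: seconds 2-8 of the clip -->
    <reference uri="video-clip.avi#t=2,8"/>
  </emotion>
</emotionml>
```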

If you’re still not getting the why, they have a list of 39 possible use cases. I’m wondering if it could be used for interactive fiction somehow?

I love crap like this!