This paper introduces a methodology for combining multi-channel psycho-physiological recordings from affective paradigms into a framework in which the scientific results of such experiments are applied in the human-computer interaction context, allowing the computer's response to be modeled on the emotional state of the user and the situation. An affective protocol is described, the results of which are expected to be combined with anthropomorphic avatars that enhance man-machine interaction. The technological infrastructure of the latter component is provided by XML specifications for signal description and emotion recognition, as well as for avatar behavior generation.
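
As an illustration only, the following is a minimal sketch of what such an XML specification might look like; the element and attribute names (AffectiveSession, Signal, EmotionEstimate, AvatarBehavior, and so on) are hypothetical and do not reproduce the schemas defined in the paper.

<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sketch: element and attribute names are illustrative, not the paper's actual schema. -->
<AffectiveSession subject="S01" paradigm="picture-viewing">

  <!-- Descriptions of the multi-channel psycho-physiological signals -->
  <Signals>
    <Signal name="EEG" channels="32" samplingRate="256" unit="uV"/>
    <Signal name="GSR" channels="1"  samplingRate="32"  unit="uS"/>
    <Signal name="ECG" channels="1"  samplingRate="512" unit="mV"/>
  </Signals>

  <!-- Output of the emotion-recognition component -->
  <EmotionEstimate valence="0.62" arousal="0.78" confidence="0.85"/>

  <!-- Input to the avatar behavior generator -->
  <AvatarBehavior>
    <Expression type="smile" intensity="0.6"/>
    <Speech prosody="calm">I can see this topic interests you.</Speech>
  </AvatarBehavior>

</AffectiveSession>

In such a scheme, the signal descriptions document what was recorded, the emotion estimate carries the recognition result, and the behavior element tells the avatar component how to respond; the actual specifications are developed in the body of the paper.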