On the afternoon of Monday, August 24, a number of tutorials will be offered. Descriptions of the tutorials can be found below.
Using Annotations for Virtual Human Research
By: Dennis Reidsma (University of Twente)
People who work with virtual humans often need to deal with manual annotation of human behavior, for various reasons.
Annotations of human interaction can tell us what a VH should do in certain situations, and what the correct form and timing is for certain actions. Annotations of humans interacting with a VH can help us analyse the experience of
people interacting with the VH, and show us how people respond to the VH in certain situations.
In this tutorial we give a basic introduction to working with annotations: how to make them, how to ensure their quality, and how to use them. We will look at tools that you need to collect the videos and annotate them, discuss the challenges of developing an annotation scheme, and look at the analysis of agreement and reliability, covering some typical pitfalls and do's and don'ts along the way.
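One of the standard agreement measures covered in analyses like these is Cohen's kappa, which corrects raw annotator agreement for the agreement expected by chance. As a rough illustration (the labels and data below are invented for the example, not taken from the tutorial), a minimal computation for two annotators might look like this:

```python
from collections import Counter

def cohens_kappa(ann_a, ann_b):
    """Cohen's kappa for two annotators' label sequences of equal length."""
    assert len(ann_a) == len(ann_b) and len(ann_a) > 0
    n = len(ann_a)
    # Observed agreement: fraction of items both annotators labeled identically.
    p_o = sum(a == b for a, b in zip(ann_a, ann_b)) / n
    # Chance agreement: from each annotator's marginal label distribution.
    count_a, count_b = Counter(ann_a), Counter(ann_b)
    p_e = sum(count_a[label] * count_b.get(label, 0) for label in count_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical labels from two annotators for the same 10 video segments.
a = ["smile", "nod", "smile", "none", "nod", "smile", "none", "nod", "smile", "none"]
b = ["smile", "nod", "smile", "none", "none", "smile", "none", "nod", "smile", "nod"]
print(round(cohens_kappa(a, b), 3))  # 0.697: "substantial" on common rules of thumb
```

Here the two annotators agree on 8 of 10 segments (observed agreement 0.8), but because chance agreement is 0.34, kappa lands noticeably lower — exactly the kind of pitfall, mistaking high raw agreement for high reliability, that the tutorial's do's and don'ts address.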
Social Signal Interpretation for Virtual Agents: Basic Concepts and Implementation in the SSI Framework
Embodied conversational agents need to be aware of what is going on around them – in particular, of how human interlocutors react to them on a socio-emotional level. To adapt adequately to human interlocutors, an agent has to analyze human behavioral data captured by a variety of sensory devices, such as cameras, microphones or depth sensors. To this end, we developed the Social Signal Interpretation (SSI) framework, an open-source tool that enables the easy integration of sensory devices and of social signal processing and interpretation techniques through an XML scripting editor. In this tutorial we give a practical introduction to capturing, processing and interpreting sensor data in real time, and to feeding relevant socio-emotional findings into agent-based applications. We introduce the general concepts of social signal processing and relevant machine learning approaches, as well as their practical implementation within the SSI framework. In a hands-on session, interested participants will be assisted in creating a full XML pipeline that recognizes multimodal human behaviors in real time and makes an agent react to them. For participants who wish to take part in the hands-on exercise, the software will be made available ahead of the tutorial (Windows 7 or 8 required). However, the tutorial will also be of value to those just attending. This tutorial is intended for practitioners and aspiring researchers.
Design and Use of Questionnaires in HCI: An introductory tutorial
Questionnaires are commonly used in Virtual Agent studies. Perhaps you have wondered in your research which questionnaire would be best suited, where to find it, or how to adapt an existing one. In this primer, we introduce the novice to the dos and don'ts and point to some new state-of-the-art methods. You will be able to put forward your own questions and case studies.
Introduction to the Virtual Human Toolkit
By: Arno Hartholt
In order to reach the full potential that intelligent virtual agents offer, they require not only a range of individual capabilities, but also the integration of these capabilities into a single framework that allows us to efficiently create characters that can engage users in meaningful and realistic social interactions. This integration requires in-depth, interdisciplinary understanding that few individuals, or even teams of individuals, possess. The ICT Virtual Human Toolkit helps address this challenge by offering a flexible framework for exploring a variety of different types of virtual human systems. Due to its modularity, the Toolkit allows researchers to mix and match the provided capabilities with their own, lowering the barrier to entry for this multi-disciplinary research challenge.
This tutorial focuses on 1) the general architectural concepts supporting the Toolkit, 2) how to install and run the Toolkit's main example, 3) how to create your own basic character, and 4) pointers on how to extend the Toolkit and/or use its capabilities within your own work. You can request the Toolkit download beforehand here: https://vhtoolkit.ict.usc.edu/download/