Chapter 1. Introduction, Concepts, And Overview

Virtual Choreographer (VC) is an XML description language for animated graphical and audio scenes. Serving the same purposes as X3D, VC is designed to specify complex geometrical scenes. It covers only a subset of the current state of the art in graphical rendering, so as to focus on the specific features that we consider important in artistic design.

VC is also an interpreter for its description language that runs on various platforms and renders graphics and sound through various tools. VC is designed for modeling both interactive and off-line scenes. At the graphical level, interactive scenes are rendered through OpenGL, possibly enhanced by specific graphics accelerator features such as vertex or fragment shaders. Off-line scenes, typically animated movies, can be output as RenderMan or POV-Ray files. At the sound level, the tool is less tightly bound to specific sound players or synthesizers: the rendering of sonic objects is performed through UDP communication between a sound server and the VC client, and it has been successively implemented with the Max-based sound generators Max/MSP, jMax, and Pure Data. Provided that the necessary libraries are installed, VC can be compiled on Linux, Mac OS X, or Windows.

1.1. An Overview of VC

1.1.1. Scene Graph Structure

Like VRML, VC uses an acyclic graph structure to describe 3D scenes. Non-terminal nodes are either transformation or interpolation nodes: they can have one or more children and can be contained in other nodes (their parent nodes). Nodes that represent media objects are leaf nodes and therefore cannot have children.
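
As a rough illustration, a small scene graph might be written as in the sketch below. The tag and attribute names (scene, transformation, shape) are illustrative assumptions rather than the actual VC vocabulary; they only mirror the structure just described.

  <scene id="root">
    <!-- non-terminal node: a transformation with two children
         (tag names are hypothetical) -->
    <transformation id="spin" type="rotation" angle="45" axis="0 1 0">
      <!-- leaf nodes: media objects, which cannot have children -->
      <shape id="cube" geometry="cube" size="1.0"/>
      <shape id="ball" geometry="sphere" radius="0.5"/>
    </transformation>
  </scene>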

1.1.2. Events and Scripts

Communication between VC, its users, and concurrent renderers such as sound synthesizers relies on a message-passing mechanism. Events fall into three categories:

  • input events: events produced by user interaction or received through the network from other applications,

  • output events: events sent to the user or to external applications for additional rendering,

  • internal events: events exchanged between objects that can trigger complex chain reactions.

Message processing is described in scripts attached to scene nodes. Each message-processing rule consists of a triggering event and one or more actions applied to one or more target nodes. Nodes have states that can be used to control event triggering. A typical action is the modification of attribute values in the XML tags that characterize an object and its properties. Scripts triggered by external events (events that are not bound to a specific scene node) are automatically attached to the scene root node.
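
For illustration, a script rule with this structure might look like the following sketch; the element and attribute names (script, trigger, action) are hypothetical stand-ins, not the actual VC tags.

  <script node="cube">
    <!-- triggering event: a user keystroke (hypothetical vocabulary) -->
    <trigger event="keystroke" key="c"/>
    <!-- action: modify an attribute value of a target node -->
    <action target="cube" attribute="size" value="2.0"/>
  </script>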

Scripts allow user-defined modifications of the scene nodes and dynamic scene control. Events are stored in a FIFO queue and processed in the order in which they are generated. To reduce event-processing load, some automatically generated events, such as the coordinates of sonic objects, are not systematically routed: such update events are only sent when their value differs from the previously sent value by more than a preset threshold.
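
For instance, this throttling of automatically generated coordinate events might be expressed by a threshold attribute, as in the hypothetical sketch below; neither the sound tag nor the positionThreshold attribute is taken from the actual VC vocabulary.

  <!-- hypothetical attribute: position update events for this sonic
       object are emitted only when a coordinate changes by more
       than 0.05 units since the last emitted value -->
  <sound id="drone" file="drone.wav" positionThreshold="0.05"/>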

1.1.3. Interpolators

Interpolators are located at non-terminal nodes and apply in parallel to all the subgraphs of the node where they are located. These subgraphs are expected to be isomorphic: they must have the same structure, transformations at corresponding non-terminal nodes must have the same type, and media objects at corresponding leaf nodes must also have the same type. Currently, interpolators cannot be nested: an interpolator cannot operate on an interpolator node.

Interpolators defined on non-terminal nodes describe animations by blending transformations. (Animations can, however, also be defined along path-based transformations.) Interpolators defined on leaf nodes modify media objects by interpolating their attribute values: interpolation of geometrical objects concerns their structure and material properties, light-source interpolation implements changes of lighting conditions, sound interpolation implements sonic blending, and so on.

An interpolator describes a polyline in an n-dimensional space, defined by k key positions; the polyline is walked along according to a schedule function.
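
Putting these elements together, an interpolation node might be declared as in the sketch below; the tag names (interpolator, keypos) and the attribute layout are assumptions made for illustration only.

  <!-- interpolation node applied in parallel to two isomorphic
       subgraphs: same structure, same transformation types, and
       media objects of the same type at corresponding leaves -->
  <interpolator id="blend" schedule="linear" period="4.0">
    <!-- k = 3 key positions of the polyline in parameter space -->
    <keypos value="0.0"/>
    <keypos value="0.7"/>
    <keypos value="1.0"/>
    <transformation type="translation" vector="0 0 0">
      <shape geometry="sphere" radius="0.5"/>
    </transformation>
    <transformation type="translation" vector="2 0 0">
      <shape geometry="sphere" radius="1.5"/>
    </transformation>
  </interpolator>

Here the schedule attribute stands for the schedule function that maps time to a position on the polyline; "linear" is used only as a plausible placeholder.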

Interpolators can also be defined on variables: scalars, tables, and matrix elements. In this case, there is no recursive percolation of the interpolation, since variables do not have child nodes.
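
A variable interpolation can then be pictured as a degenerate case of the same construct, for example (hypothetical names again):

  <!-- interpolating a scalar variable between two key values;
       no percolation, since a variable has no child nodes -->
  <interpolator target="var:opacity" schedule="linear" period="2.0">
    <keypos value="0.0"/>
    <keypos value="1.0"/>
  </interpolator>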

1.1.4. VC Syntax

VC is a markup language defined in the SGML/XML framework. In the near future, the VC language will be formally defined by a DTD, the native grammar-definition syntax of SGML and XML.
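
As an indication of what such a definition looks like, a DTD fragment for the hypothetical elements used in the sketches above could read:

  <!ELEMENT transformation (transformation | interpolator | shape)*>
  <!ATTLIST transformation
            id   ID    #IMPLIED
            type CDATA #REQUIRED>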

1.1.5. Distributed Scenes

As with any XML document, the parts of a VC scene can reside at different locations on the Web. Through <XREF> links, external sub-scenes can be referenced inside a VC scene specification.
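
For example, a remote sub-scene might be included as follows. The <XREF> tag itself comes from VC, but the attribute name shown here is an assumption:

  <!-- pulls an external sub-scene from another location on the Web;
       the attribute name is hypothetical -->
  <XREF href="http://example.org/scenes/forest.xml"/>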

1.1.6. Windows and Views

VC scenes are displayed in possibly nested windows. Each window is associated with a user located at a viewpoint of the current scene. Users can change viewpoints as the scene evolves over time, and each user can also move inside the coordinate system of his or her current viewpoint. Viewpoint change and user motion are the two major modes of interactive navigation.
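
As a final sketch, a window binding a user to a viewpoint might be declared as follows; all tag and attribute names here are hypothetical:

  <!-- hypothetical sketch: a window whose user observes the scene
       from a named viewpoint and may later switch to another one -->
  <window id="main" width="800" height="600">
    <viewpoint id="overview" position="0 2 10"/>
    <user id="viewer1" viewpoint="overview"/>
  </window>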