Enhancing Theater Performance Using Augmented Reality
ABSTRACT
In this paper, we present the first augmented reality system in the
Philippines used with a theatrical performance. Augmented
reality follows each main actor and intelligently combines active
graphics with the actor during the performance. Each main actor
is identified by a unique color which is integrated into his
costume. The computer system, with an attached video camera
capturing the performance, finds and tracks this unique color in
real time using histogram back-projection. The image location of
the color is then overlaid with graphics that enhance that actor's
character. The captured video of the performance, together with
the graphic enhancements, is then projected on a separate screen.
The augmented reality system was used in the UP Los Baños
Department of Humanities’ production of the stage play "Tao".
INTRODUCTION
Augmented reality deals with the combination of real world
events and computer-generated graphics or data. It is mostly
concerned with the use of real time video imagery which is
digitally processed and augmented with computer-generated
graphics. Unlike virtual reality, which takes the user in a
completely computer-generated environment, augmented reality
works by overlaying computer-generated graphics such as texts,
images, and other visual animations onto the real environment.
Despite the fact that augmented reality is still in its early stage of
research and development, it has already been used in various
domains such as medicine, robotics, entertainment, engineering,
and even the military. Early applications of augmented reality
appeared in televised races and football games, where
advertisements are superimposed on certain locations during live
coverage of these sporting events. Another simple application of
the technology in mass media is news reporting, where a weather
reporter stands in front of a blue or green screen onto which
animated weather graphics are overlaid. Special effects in motion
pictures, by contrast, are so far done off-line: the effects are
integrated frame by frame only after the scene is captured.
Augmented reality instead integrates the desired computer-generated
effect while the video is being captured. There have been a few
documented attempts to enhance theatrical performances with digital
effects by tracking markers present on specific objects on stage
and overlaying graphics in real time.
RELATED WORK
Many image features, such as color, edges, optical flow, and texture,
have been used to track objects depending on the application
domain. Of these, color has proven to be an efficient visual
feature for tracking objects particularly in real-time as
demonstrated in [9, 13].
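The tracking step named in the abstract, histogram back-projection, can be sketched in a deliberately minimal form. The hue-only model and the bin count below are our simplifying assumptions for illustration, not details from the paper:

```cpp
#include <array>
#include <cstddef>
#include <vector>

constexpr int NB = 16;  // number of hue bins (an illustrative choice)

// Build a normalized hue histogram from reference samples, e.g. pixels
// taken from a patch of the unique costume color.
std::array<float, NB> buildHistogram(const std::vector<int>& hues) {
    std::array<float, NB> h{};
    for (int hue : hues) h[hue * NB / 360] += 1.0f;
    for (float& v : h) v /= static_cast<float>(hues.size());
    return h;
}

// Back-projection: replace each frame pixel's hue with the likelihood of
// that hue under the reference model. High values mark the tracked color.
std::vector<float> backProject(const std::vector<int>& frameHues,
                               const std::array<float, NB>& model) {
    std::vector<float> bp(frameHues.size());
    for (std::size_t i = 0; i < frameHues.size(); ++i)
        bp[i] = model[frameHues[i] * NB / 360];
    return bp;
}
```

Thresholding the back-projection map and taking the centroid of the surviving pixels then gives an image location at which an overlay can be drawn.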
While the RGB color space is most native to video capture
devices, it is very sensitive to variations in brightness and
intensity. Hayashi and Fujiyoshi [5] in particular attempt to get
around this difficulty by creating an RGB-illuminance space. As
an alternative to the RGB color space, other studies turned to
using other color spaces such as HSI and YUV where the
intensity component Y can be treated differently from the
chrominance U and V. In [9], for instance, Lee et al. applied a
color-modeling approach that incorporates intensity information in
the HSI color space using B-spline curves, motivated by the fact
that the color distribution of a single-colored object is not
invariant to brightness variations even in the HS plane.
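The appeal of separating chrominance from intensity can be seen in a small sketch: under the idealized model in which a brightness change scales R, G, and B uniformly, hue is unchanged. (The point of [9] is precisely that real scenes violate this ideal, hence the intensity-aware model.) The function below is a standard RGB-to-hue conversion, not code from the paper:

```cpp
#include <algorithm>
#include <cmath>

// Compute hue (in degrees) from RGB values in [0, 1]. Hue depends only on
// the ratios among R, G, B, so uniformly scaling all three channels
// (an idealized brightness change) leaves it unchanged.
double hueOf(double r, double g, double b) {
    double mx = std::max({r, g, b}), mn = std::min({r, g, b});
    double d = mx - mn;
    if (d == 0) return 0.0;  // achromatic: hue undefined, return 0 by convention
    double h;
    if (mx == r)      h = std::fmod((g - b) / d, 6.0);
    else if (mx == g) h = (b - r) / d + 2.0;
    else              h = (r - g) / d + 4.0;
    h *= 60.0;
    return h < 0 ? h + 360.0 : h;
}
```

For example, halving every channel of a red pixel leaves its hue unchanged, which is why tracking on hue tolerates moderate stage-lighting changes that would defeat raw RGB thresholds.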
Generally, the main concern in color-based tracking is the
development of a method that is capable of adapting to different
lighting conditions. These methods, such as those in [4], are
referred to as color constancy or color normalization methods.
Allen et al. [1] propose robust real-time tracking based on color
thresholding: the object to be tracked is a user-selected region in
the initial frame of the video, from which foreground object color
clusters are extracted with the K-means algorithm and filtered via
a foreground extraction mask. Krantchenko [7] likewise
was able to formulate a procedure to detect dielectric objects
under non-white illumination, shadows, highlights and even
variable viewing and camera operating conditions.
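The K-means step used in [1] can be illustrated with a deliberately tiny one-dimensional version over hue samples. This is a sketch of the clustering idea only; the actual method in [1] clusters in a full color space:

```cpp
#include <cmath>
#include <utility>
#include <vector>

// Minimal 1-D k-means with k = 2: alternately assign each sample to the
// nearer center, then move each center to the mean of its cluster.
std::pair<double, double> kmeans2(const std::vector<double>& xs) {
    double c0 = xs.front(), c1 = xs.back();  // naive initialization
    for (int it = 0; it < 20; ++it) {
        double s0 = 0, s1 = 0;
        int n0 = 0, n1 = 0;
        for (double x : xs) {
            if (std::fabs(x - c0) <= std::fabs(x - c1)) { s0 += x; ++n0; }
            else                                        { s1 += x; ++n1; }
        }
        if (n0) c0 = s0 / n0;
        if (n1) c1 = s1 / n1;
    }
    return {c0, c1};
}
```

Pixels whose color falls near the dominant cluster center are kept as foreground; the rest are masked out.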
RESULTS
The system was developed on a standard PC running Fedora
Core 7 and was written in C++. It runs on any
Linux workstation with the standard Simple DirectMedia Layer
(SDL) packages installed.
Our system augments an actual theatrical performance with
realistic graphical overlays with the use of multiple color trackers
that are robust to varying illumination conditions. The input to the system
consists of a video stream of the actual performance and image
sequences of the graphical overlays.
The performance of any actor with a distinct costume color can be
correctly augmented with any graphical overlay that is available in
the system's directory of graphical overlays. The system has been
designed so that new overlays can be added with little
reconfiguration. Figures 6 and 7 show a series of video stills of the
actors/actresses captured during an actual performance of the play,
together with the overlaid active graphics.
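Placing an overlay at the tracked image location amounts to a standard source-over alpha blend. A minimal sketch follows; the `Image` struct and the packed-RGB byte layout are our assumptions for illustration (the actual system renders through SDL):

```cpp
#include <cstdint>
#include <vector>

// A packed RGB image, 3 bytes per pixel, row-major.
struct Image { int w, h; std::vector<uint8_t> px; };

// Source-over alpha blend of `overlay` onto `frame` at offset (ox, oy).
// `alpha` holds one opacity byte per overlay pixel (255 = fully opaque).
void blit(Image& frame, const Image& overlay,
          const std::vector<uint8_t>& alpha, int ox, int oy) {
    for (int y = 0; y < overlay.h; ++y)
        for (int x = 0; x < overlay.w; ++x) {
            int fx = ox + x, fy = oy + y;
            if (fx < 0 || fy < 0 || fx >= frame.w || fy >= frame.h)
                continue;  // clip overlay pixels that fall outside the frame
            float a = alpha[y * overlay.w + x] / 255.0f;
            for (int c = 0; c < 3; ++c) {
                uint8_t& dst = frame.px[(fy * frame.w + fx) * 3 + c];
                uint8_t src = overlay.px[(y * overlay.w + x) * 3 + c];
                dst = static_cast<uint8_t>(src * a + dst * (1 - a) + 0.5f);
            }
        }
}
```

Feeding (ox, oy) from the per-frame centroid of the tracked color keeps the graphic attached to the actor as he moves.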
CONCLUSION and FUTURE WORK
We developed an augmented reality system used during a live
performance of a stage play. The system augments the performance
by tracking, in real time, colors integrated into the actors'
costumes and overlaying different graphics on them to enhance
the characters the actors portray. This is done by
capturing the live performance using a video camera and finding
and tracking the color that uniquely identifies each actor. Selected
active graphics are then overlaid on the image location of the
color and the resulting video is projected on a separate screen
located onstage.