Army enlists Tinseltown
- By Dan Caterinicchia, Dan Verton
- May 01, 2000
The Army has pulled together a team of cinematography experts from Hollywood
to help it harness the technology depicted in the hit movie "The Matrix"
and TV series "Star Trek: The Next Generation" for use in its own next generation
of training and simulation systems.
Lt. Gen. William Campbell, the Army's chief information officer, said
the Army is studying the feasibility of building a "holodeck," a cutting-edge
simulator that would use virtual cinematography and video game technology
to create realistic 3-D scenes of actual locations worldwide that soldiers
could use for training and mission rehearsal. "It allows you to go anywhere,
anytime," Campbell said.
Campbell, who spoke last week at the annual conference on Information
Assurance and Battlefield Visualization, sponsored by the Association of
the U.S. Army and Association of Old Crows, showed a video of the now-famous
"bullet time" sequences from "The Matrix," which starred actor Keanu Reeves
as Thomas "Neo" Anderson, a computer hacker who is trapped in a simulated
version of 20th-century life known as the Matrix. In the sequence, Reeves
is able to dodge bullets by bending backward.
Campbell said it is this type of photo-manipulation technology, as well
as the simulation technology depicted in "Star Trek: The Next Generation,"
that the Army hopes to use in its future holodeck.
James Heath, senior intelligence and technical adviser at the U.S. Army's
Land Information Warfare Activity, said visualization is key to the future
of the Army. "Not only will [the holodeck] happen, but it's really mandatory," Heath said.
The Defense Department's relationship with the entertainment industry
has been growing closer. Last year, the Defense Modeling and Simulation
Office and Paramount Digital Entertainment began work on adapting Hollywood
multimedia technology and movie storytelling skills to create realistic
simulations for military officers learning how to make better decisions
during international crises [FCW, Aug. 30, 1999].
Also last year, the Army signed a five-year, $45 million contract with
the University of Southern California to establish the Institute for Creative
Technologies, a center for researching applications to improve realism in
training simulators. Under that contract, the Army is expecting movie producers
and computer game makers to develop new and better technologies.
The institute is a key member of the "team of experts" tapped by the
Army to work with production designers from the entertainment industry and
the university. Paul Debevec, a filmmaker, scientist and leader in image-based
modeling, rendering and lighting, which uses photographs to simulate real events, was one of the first experts hired by the institute. Film work
by Debevec, who is based at the University of California at Berkeley, helped
inspire some of the Academy Award-winning visual effects in "The Matrix."
"The holodeck is the Holy Grail of the institute," Debevec said. "It
will be a next-generation virtual reality simulation technique that will
make it possible for a person to go into a room or put on a headset and
really feel like they are in a different place. They will be able to see,
hear, touch and even smell everything. The terrain or environment will be
realistic, and eventually there will even be other characters to interact
with and teach and learn from."
One of Debevec's students, George Borshukov, served as a technical designer
for the "bullet time" sequences. Borshukov, who works in research and development
at Alameda, Calif.-based Manex Entertainment, the company that managed the
effects for "The Matrix," said the technology is almost mature enough to
produce a completely realistic virtual environment.
"If you work from photos of real environments, you can get 95 percent
realism, but that doesn't include people or dynamics," Borshukov said. "Photo-
realistic humans and other stuff is a little farther away, especially for
real time, which is what the Army would want."
Borshukov said applications using a photo-realistic base with real-time
interaction are probably five to 10 years away, "but the technology is already
there and there's already a plan of how to do it."
"Capturing people doing real things on film and stringing that together
with real environments [will be done]," he said. But "the ultimate goal
[of simulating] all the physics, as opposed to simply image-based rendering,
is 20 to 25 years away."