Computer Generated Imagery

Computer-generated imagery (CGI) is the application of computer graphics (or, more specifically, 3D computer graphics) to special effects. CGI is used in movies, television programs and commercials, and in printed media. Real-time computer graphics, such as those in video games, are rarely referred to as CGI.

CGI is used because it is often cheaper than physical methods, such as constructing elaborate miniatures for effects shots or hiring large numbers of extras for crowd scenes, and because it allows the creation of images that would not be feasible by any other method. It also allows a single artist to produce content without the use of actors or other contributors.

2D CGI was first used in movies in 1973's Westworld, though the first use of 3D imagery came in its sequel, Futureworld (1976), which featured a computer-generated hand and face created by then University of Utah graduate students Edwin Catmull and Fred Parke. The first two films to invest heavily in CGI, Tron (1982) and The Last Starfighter (1984), were commercial failures, leading most directors to relegate CGI to images that were supposed to look as though a computer had created them. Photorealistic CGI did not win over the motion picture industry until 1989, when The Abyss won the Academy Award for Visual Effects; for that film, Industrial Light & Magic produced photorealistic CGI visual effects, including a seawater creature lovingly dubbed the water weenie.

2D CGI increasingly appeared in "traditional" animated films, where it supplemented the use of hand-illustrated cels. Its uses ranged from digitally tweening motion between frames to eye-catching quasi-3D effects, such as the ballroom scene in Beauty and the Beast (1991).
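To make the tweening idea concrete, the short Python sketch below generates in-between positions by linearly interpolating between two keyframes. The function names and coordinates are illustrative only, not taken from any production animation system.

    # A minimal sketch of digital tweening (illustrative only): in-between
    # positions are generated by linear interpolation between two keyframes.

    def lerp(a: float, b: float, t: float) -> float:
        """Linearly interpolate between a and b for t in [0, 1]."""
        return a + (b - a) * t

    def tween(key_start, key_end, num_inbetweens):
        """Return in-between (x, y) positions between two keyframe positions."""
        frames = []
        for i in range(1, num_inbetweens + 1):
            t = i / (num_inbetweens + 1)
            frames.append((lerp(key_start[0], key_end[0], t),
                           lerp(key_start[1], key_end[1], t)))
        return frames

    # Three in-betweens for an object moving from (0, 0) to (100, 40):
    print(tween((0.0, 0.0), (100.0, 40.0), 3))
    # [(25.0, 10.0), (50.0, 20.0), (75.0, 30.0)]

In a real pipeline the interpolation is rarely purely linear; easing curves or splines are used so motion accelerates and decelerates naturally, but the keyframe-plus-interpolation structure is the same.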

In 1995, the first fully computer-generated feature film, Pixar's Toy Story, was a resounding commercial success. Additional digital animation studios, such as Blue Sky Studios (Fox) and Pacific Data Images (DreamWorks SKG), went into production, and existing animation companies such as Disney began the transition from traditional animation to CGI.

Between 1995 and 2005, the average effects budget for a wide-release feature film skyrocketed from $5 million to $40 million. According to one studio executive, as of 2005 more than half of feature films had significant effects.

In the early 2000s, computer-generated imagery became the dominant form of special effects. The technology progressed to the point that virtual stunt doubles could be nearly indistinguishable from the actors they replaced. Computer-generated extras also came to be used extensively in crowd scenes. The timeline of CGI in movies gives a detailed list of pioneering uses of computer-generated imagery in film and television.

CGI for films is usually rendered at about 1.4–6 megapixels; Toy Story, for example, was rendered at 1536 × 922 (about 1.42 megapixels). The time to render one frame is typically around 2–3 hours, with ten times that for the most complex scenes. This time has not changed much in the last decade, as image quality has progressed at roughly the same rate as hardware.
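As a rough sanity check on these figures, the Python sketch below computes Toy Story's megapixel count and estimates the wall-clock time to render a feature film on a render farm. The film length, per-frame time, and machine count are hypothetical assumptions, not production data.

    # Back-of-envelope check on the figures above. The film length, per-frame
    # render time, and render-farm size are hypothetical assumptions.

    width, height = 1536, 922       # Toy Story's render resolution
    print(width * height / 1e6)     # 1.416192, i.e. ~1.4 megapixels

    frames = 90 * 60 * 24           # a 90-minute film at 24 frames per second
    hours_per_frame = 2.5           # middle of the typical 2-3 hour range
    machines = 1000                 # hypothetical render-farm size
    days = frames * hours_per_frame / machines / 24
    print(round(days, 1))           # 13.5 days of wall-clock rendering

The arithmetic makes clear why studios render on farms of hundreds or thousands of machines: at 2.5 hours per frame, a single computer would need roughly 37 years for the same film.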

Developments in CGI technologies are reported each year at SIGGRAPH, an annual conference on computer graphics and interactive techniques attended by tens of thousands of computer professionals.

Developers of computer games and 3D video cards strive to achieve the same visual quality on personal computers in real time as is possible for CGI films and animation. With the rapid advancement of real-time rendering quality, artists began using game engines to render non-interactive movies, an art form known as machinima.
