Computer-generated images (CGI)
Computer-generated imagery (CGI) refers to the creation of still or animated visual content through computer software. Initially developed in the 1950s for military and scientific applications, CGI was not originally intended for artistic purposes due to the perceived complexity of the technology. By the late 1960s, advancements allowed independent animators and television producers to experiment with CGI, although the technology was still limited and expensive. The 1980s marked a significant turning point, as improvements in computer technology made CGI more accessible and commercially viable—most notably with Pixar's innovations and Disney's release of *Tron*, which showcased CGI's potential.
Throughout the 1990s, the integration of CGI in films and television flourished, with notable examples including *Jurassic Park* and *Titanic*. The era saw CGI used not only to enhance visual storytelling but also to create entire animated feature films such as *Toy Story*, the first of its kind. As CGI became increasingly sophisticated, filmmakers began to blend real and digital worlds seamlessly, allowing for more imaginative narratives and visuals. Today, CGI has transformed the landscape of film and television, enabling the creation of vibrant, fantastical worlds that engage audiences in new and exciting ways. Despite its widespread use, many creators still value the energy of live settings, balancing CGI with traditional filmmaking techniques.
CGI forever changed the way television and film portray images and tell stories, while bringing new attention to the special-effects industry.
Computer animation was first used in the 1950s, when Bell Laboratories and other research centers employed it to create graphics for military, manufacturing, and applied-science applications. Computer animation was not developed for artistic work, as it was believed to be too technical for such use. High-tech computer-graphics laboratories and computer-graphics experts began experimenting with CGI in the 1960s. In the late 1960s and early 1970s, CGI was used by independent animators and television producers for commercials and station logos, but the graphics technology was very limited. Early three-dimensional (3-D) computer animation and imaging systems functioned only on slow, costly mainframe computers, and the cost and limitations of the hardware restricted the use of computer graphics.
[Image: CGI rendering of workmen wearing air filters while cleaning up a building after fumigation. Anthony Appleyard, via Wikimedia Commons (GFDL).]
During the 1970s and 1980s, computer technology became more practical and useful. The 1970s brought major technical advances and lower costs, and 3-D computer animation and imaging progressed greatly. In the 1980s, CGI became an area of artistic and commercial appeal because of enhanced technology, a larger pool of people trained in computer animation and imaging, and a larger market.
CGI first drew major public attention with the release of Disney's *Tron* in 1982, which relied on CGI scenes to depict the internal world of a computer. The Genesis effect in *Star Trek II: The Wrath of Khan* (1982) was the first entirely computer-generated animation in a feature film and the longest-running such sequence to that point. The first completely computer-generated character was the stained-glass knight in *Young Sherlock Holmes* (1985). Pixar began pushing the limits of CGI technology with character animations in the mid- and late 1980s. Director Robert Zemeckis used CGI in the *Back to the Future* films (1985, 1989, 1990), and in *Who Framed Roger Rabbit* (1988) he integrated live action and animation. Throughout the 1980s, however, CGI was used to support the story, not carry it.
CGI Becomes Widespread
Falling computer prices and increasing hardware capability and power in the 1990s allowed visual professionals to integrate CGI more widely. In addition, as 3-D animation became more complex and varied, it greatly impacted television. In the 1980s and for part of the 1990s, CGI was too expensive and time-consuming for most television production, though it was used in commercials, credit sequences, music videos, feature films, and video games. Television began turning to CGI in the mid-1990s, mainly for commercials, but the animation was not always a smooth fit, and audiences knew that the images were playing with reality. The first mainstream 3-D computer animation on television appeared in 1993, in the Coca-Cola polar bear commercials and the series *Babylon 5*. Steven Spielberg's television series *The Young Indiana Jones Chronicles* (1992-1993) used CGI to clone extras and place its characters in exotic locations.
In the early 1990s, directors Spielberg, Zemeckis, and James Cameron promoted the use of the new imaging technologies. Filmmakers believed that cinema is driven to create photorealistic imagery and that audiences want to believe in the magic and simply enjoy the story rather than wonder at the effects. The creation or simulation of reality became the main emphasis, and improvements in the power of CGI systems, graphic clarity, and resolution gave directors unprecedented control over what the audience would see. Directors could also realize their visions, no matter how fantastic or complex the special effects required. Science-fiction cinema was producing the most spectacular CGI films, and by 1993 the ten highest-grossing films of all time all featured special effects.
Working with Industrial Light & Magic (ILM), the special-effects company for *The Abyss* (1989), Cameron used CGI in a way that was considered groundbreaking, demonstrating its dramatic and artistic potential. Cameron also created the first CGI main character in film, the T-1000, in *Terminator 2: Judgment Day* (1991). His 1997 *Titanic* seamlessly integrated a digital world into live action and went on to become one of the highest-grossing films of all time.
Spielberg used CGI dinosaurs in combination with models and manipulated images for the extremely successful *Jurassic Park* (1993) and *The Lost World: Jurassic Park* (1997). Zemeckis inserted a live actor into historical footage and manipulated historical figures in *Forrest Gump* (1994). His 1997 film *Contact* used CGI throughout, in both obvious and subtle ways, and is considered a milestone in the use of animation and CGI for telling a story. *Toy Story* (1995) was the first fully 3-D computer-animated feature-length film and was followed by the sequel, *Toy Story 2*, in 1999. By 1999, CGI effects were heavily used in films such as *The Matrix*, which sparked mainstream interest in virtual reality, and *Star Wars: Episode I—The Phantom Menace*, in which director George Lucas included fully computer-generated characters, including the controversial Jar Jar Binks.
CGI in the Twenty-First Century
CGI continued to evolve in the 2000s. In 2001, *Final Fantasy: The Spirits Within* became the first theatrically released feature film to feature photorealistic CGI actors; each of its 141,964 frames took ninety minutes to render, and the finished film occupied 15 terabytes of storage. However, the film, which cost $150 million to make, was not successful critically or commercially. Peter Jackson's *Lord of the Rings* trilogy was considered noteworthy for its extensive use of CGI for its various fantasy creatures; in 2002, Andy Serkis, who played Gollum in the films, became the first actor nominated for the new Critics' Choice Movie Award for best digital acting performance. The trilogy also used artificial intelligence to guide CGI characters in its large-scale battle scenes.
James Cameron's *Avatar*, released in 2009, featured motion-captured actors, using cutting-edge facial-capture technology, in a fully CGI photorealistic environment. The movie was a resounding success at the box office, and its visuals were widely praised as beautiful and immersive. To create the movie's detailed animated world, Cameron created the first digital art department. The same year, *Up* became the first computer-animated feature to be nominated for the Academy Award for best picture. In 2010, *Toy Story 3* became the first feature-length CGI film to gross more than $1 billion.
However, audiences also began to tire of CGI in the 2010s, with some complaining of its overuse, particularly in science fiction, fantasy, and superhero movies. These complaints contributed to the lukewarm critical reception of films such as Peter Jackson's *Hobbit* trilogy, as well as to the popularity of films that forgo heavy CGI, such as the practical-effects-driven *Mad Max: Fury Road* (2015).
Impact
CGI challenged established ways of communicating thoughts and ideas in visual form. By the end of the 1990s, filmmakers were using the technology to create their own interpretations of reality, and there was an explosion of productions featuring high-quality, creatively diverse 3-D computer animation. Science fiction once again engaged the public in imagining other worlds and creatures, and in envisioning a future more fantastical than any imagined in a long time.
Even though CGI techniques have become cheaper and more accessible, many directors and actors still prefer the energy of a live set combined with the technology. Audiences now expect CGI and often cannot tell when it is used, because the integration has become so seamless.