Software Takes Command




100 http://en.wikipedia.org/wiki/Motion_graphics, accessed August 25, 2008.

101 For a rare discussion of the prehistory of motion graphics, as well as an equally rare attempt to analyze the field through a set of concepts rather than as the usual coffee-table portfolio of individual designers, see Jeff Bellantoni and Matt Woolman, Type in Motion (Rizzoli, 1999).

102 http://en.wikipedia.org/wiki/Motion_graphic, accessed August 27, 2008.

103 Included on onedotzero_select DVD 1. Online version at http://www.pleix.net/films.html, accessed April 8, 2007.

104 “Invisible effect” is a standard industry term. For instance, the film Contact, directed by Robert Zemeckis, was nominated for the 1997 VFX HQ Awards in the following categories: Best Visual Effects, Best Sequence (The Ride), Best Shot (Powers of Ten), Best Invisible Effects (Dish Restoration), and Best Compositing. See www.vfxhq.com/1997/contact.html.

105 In December 2005, I attended the Impakt media festival in Utrecht and asked the festival director what percentage of the submissions they received that year featured hybrid visual language as opposed to “straight” video or film. His estimate was about 50 percent. In January 2006, I was part of the review team that judged the projects of students graduating from SCI-Arc, a well-known research-oriented architecture school in Los Angeles. According to my informal estimate, approximately one half of the projects featured complex curved geometry made possible by Maya, a modeling program now commonly used by architects. Given that both After Effects and Maya’s predecessor, Alias, were introduced in the same year, 1993, I find this quantitative similarity in the percentage of projects using the new languages made possible by this software quite telling.

106 For examples, consult Paul Spinrad, ed., The VJ Book: Inspirations and Practical Advice for Live Visuals Performance (Feral House, 2005); Timothy Jaeger, VJ: Live Cinema Unraveled, available from www.vj-book.com; and websites such as www.vjcentral.com and www.live-cinema.org.

107 http://en.wikipedia.org/wiki/Digital_backlot, accessed April 8, 2007.

108 http://en.wikipedia.org/wiki/Hypermodernity, accessed August 24, 2008.

109 David Bordwell and Kristin Thompson, Film Art: An Introduction, 5th edition (The McGraw-Hill Companies, 1997), 355.

110 This definition is adopted from Bordwell and Thompson, Film Art, 355.

111 In the case of video, one of the main reasons it was difficult to combine multiple visual sources was the rapid degradation of the signal each time an analog videotape was copied: after more than a couple of generations, a copy would no longer meet broadcasting standards.

112 Jeff Bellantoni and Matt Woolman, Type in Motion (Rizzoli, 1999), 22-29.

113 While special effects in feature films did, of course, often combine different media, the media were used together to create a single illusionistic space rather than juxtaposed for aesthetic effect, as in the films and titles of Godard, Zeman, Ferro, and Bass.

114 See dreamvalley-mlp.com/cars/vid_heartbeat.html#you_might.

115 Thomas Porter and Tom Duff, “Compositing Digital Images,” ACM Computer Graphics vol. 18, no. 3 (July 1984): 253-259.
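
Since notes 116 and 122 below build on what this paper made possible, a minimal sketch of its central “over” operator may help. This is my own illustration in Python of the algebra the paper formalizes, not code from the paper itself:

```python
# Porter-Duff "over": pixels carry premultiplied color plus an alpha
# (coverage) value, and compositing A over B is computed per pixel as
#   out = A + (1 - alpha_A) * B
def over(a_rgb, a_alpha, b_rgb, b_alpha):
    """Composite premultiplied pixel A over pixel B."""
    out_alpha = a_alpha + (1.0 - a_alpha) * b_alpha
    out_rgb = tuple(ca + (1.0 - a_alpha) * cb for ca, cb in zip(a_rgb, b_rgb))
    return out_rgb, out_alpha

# A half-transparent red pixel over an opaque blue one:
print(over((0.5, 0.0, 0.0), 0.5, (0.0, 0.0, 1.0), 1.0))
# -> ((0.5, 0.0, 0.5), 1.0)
```

Because “over” is associative, a whole stack of layers can be reduced pairwise, which is part of what makes layer-based compositing interfaces practical.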

116 I should note that compositing functionality was gradually added to most NLE software over time, so today the distinction between the original After Effects and Flame interfaces, on the one hand, and the Avid and Final Cut interfaces, on the other, is less pronounced.

117 Quoted in Michael Barrier, Oskar Fischinger, Motion Painting No. 1.

118 Depending on the complexity of the project and the hardware configuration, the computer may or may not be able to keep up with the designer’s changes. Often a designer does have to wait for the computer to render every frame after she makes changes. However, since she has control over this rendering process, she can instruct After Effects to show only the outlines of objects, to skip some layers, and so on, thus giving the computer less information to process and allowing for real-time feedback (see the sketch after this note).

While a graphic designer does not have to wait for film to be developed or for a computer to finish rendering an animation, design has its own “rendering” stage: making proofs. With both digital and offset printing, once the design is finished it is sent to the printer, which produces test prints. If the designer finds any problems, such as incorrect colors, she adjusts the design and asks for proofs again.
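
A hypothetical sketch of the preview mechanism described at the start of this note: by dropping flagged layers and substituting outlines for fully rendered objects, the application processes less information per frame. The class and function names are invented for illustration; they are not the actual After Effects API.

```python
class Layer:
    def __init__(self, name, skip_in_preview=False):
        self.name = name
        self.skip_in_preview = skip_in_preview  # designer chose to omit this layer

def render_preview(layers, outlines_only=False):
    """Render a cheap draft frame: skip flagged layers, optionally draw outlines."""
    frame = []
    for layer in layers:
        if layer.skip_in_preview:
            continue  # less to process -> closer to real-time feedback
        frame.append(f"outline:{layer.name}" if outlines_only else f"full:{layer.name}")
    return frame

comp = [Layer("background"), Layer("particles", skip_in_preview=True), Layer("type")]
print(render_preview(comp, outlines_only=True))
# -> ['outline:background', 'outline:type']
```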



119 Soon after the initial release of After Effects in January 1993, the company that produced it was purchased by Adobe, which was already selling Photoshop.

120 Photoshop and After Effects were originally designed by different people at different times, and even after both came under Adobe’s ownership (Adobe released Photoshop in 1989 and After Effects in 1993), it took the company a number of years to build close links between the two programs, eventually making it easy to go back and forth between them.

121 I say “original” because in later versions of After Effects Adobe added the ability to work with 3D layers.

122 If 2D compositing can be understood as an extension of twentieth-century cel animation, where a composition consists of a stack of flat drawings, the conceptual source of the 3D compositing paradigm is different. It comes out of the work on integrating live-action footage and CGI done in the 1980s in the context of feature film production. Both a film director and a computer animator work in a three-dimensional space: the physical space of the set in the first case, the virtual space defined by 3D modeling software in the second. Conceptually, therefore, it makes sense to use three-dimensional space as a common platform for the integration of these two worlds. It is not accidental that NUKE, one of the leading programs for 3D compositing today, was developed in-house at Digital Domain, which was co-founded in 1993 by James Cameron, the Hollywood director who systematically advanced the integration of CGI and live action in films such as The Abyss (1989), Terminator 2 (1991), and Titanic (1997).
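
A schematic sketch, in the same hypothetical vein as the earlier ones, of the two paradigms this note contrasts: in 2D compositing the stacking order of flat layers alone decides what covers what, while in 3D compositing it is depth in a shared virtual space relative to a camera. All names are illustrative and tied to no particular package.

```python
def composite_2d(layers):
    """2D paradigm: paint flat layers in their fixed stacking order (cf. cel animation)."""
    return [name for name, _depth in layers]

def composite_3d(elements, camera_z=0.0):
    """3D paradigm: paint elements farthest from the virtual camera first."""
    ordered = sorted(elements, key=lambda e: abs(e[1] - camera_z), reverse=True)
    return [name for name, _z in ordered]

scene = [("set", 10.0), ("actor", 5.0), ("title", 1.0)]
print(composite_2d(scene))  # ['set', 'actor', 'title'] -- order fixed by the stack
print(composite_3d(scene))  # the same here, but it changes if the camera
                            # or the elements move in the shared space
```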

123 Alan Okey, post to forums.creativecow.net, December 28, 2005, <http://forums.creativecow.net/cgi-bin/dev_read_post.cgi?forumid=154&postid=855029>.

124 http://www.adobe.com/products/illustrator/features/allfeatures/, accessed August 30, 2008.

125 Ibid.

126 In 1995, After Effects 3.0 added the ability to import Illustrator files and to import Photoshop files as compositions. http://en.wikipedia.org/wiki/Adobe_After_Effects, accessed August 28, 2008.

127 Although the details vary among different software packages, the basic paradigm I am describing here is common to most of them.

128 Greg Lynn, Animate Form (Princeton Architectural Press, 1999), 102-119.

129 I am grateful to Lars Spuybroek, the principal of Nox, for explaining to me how the use of software for architectural design subverted traditional architectural thinking based on typologies.

130 In the case of narrative animation produced in Russia, Eastern Europe, and Japan, the visual changes within a frame were not always driven by the development of the narrative; they could serve other purposes, such as establishing a mood, representing an emotional state, or simply being used aesthetically for their own sake.

131 In the “Compositing” chapter of The Language of New Media, I have defined “stylistic montage” as “juxtapositions of stylistically diverse images in different media.”

132 Alan Kay and Adele Goldberg, “Personal Dynamic Media,” IEEE Computer 10, no. 3 (March 1977). My quote is from the reprint of this article in The New Media Reader, ed. Noah Wardrip-Fruin and Nick Montfort (Cambridge, MA: MIT Press, 2003).

133 For technical details of the method, see the publications of Georgi Borshukov: www.virtualcinematography.org/publications.html.

134 Although not everybody would agree with this analysis, I feel that after the end of the 1980s the field slowed down significantly: on the one hand, all the key techniques that can be used to create photorealistic 3D images had already been discovered; on the other hand, the rapid development of computer hardware in the 1990s meant that computer scientists no longer had to develop new techniques to make rendering faster, since the already developed algorithms would now run fast enough.

135 The terms “reality simulation” and “reality sampling” are my own; the terms “virtual cinema,” “virtual human,” “universal capture,” and “virtual cinematography” come from John Gaeta. The term “image-based rendering” was already in use in the 1990s; see the publication list at http://www.debevec.org/Publications/, accessed September 4, 2008.

136 Therefore, while the Wired article that positioned Gaeta as a groundbreaking pioneer and a rebel working outside of Hollywood contained the typical journalistic exaggeration, it was not that far from the truth. Steve Silberman, “Matrix 2,” Wired 11.05 (May 2003).
