For Producing, Delivering, and Consuming Animated Content, The Polygon is The New Pixel
According to Statista, the worldwide animation market was worth $265 billion in 2019, a figure that includes the video game industry, which Statista sizes at $110 billion in its own study. By these measures at least, the linear (film and video) animation and digital effects market may be at least as large as the video game market.
For Animation, 3D Content Creation Pipelines Can Reduce Costs and Improve Quality
The linear animation market is being driven by strong growth in over-the-top (OTT) streaming. (According to Allied Market Research, the OTT market has a CAGR of 16.7% and will exceed $330 billion by 2025.) Revenues from the Japanese animation industry alone exceeded $19 billion in 2019, driven largely by exports for OTT consumption.
The Japanese animation industry still depends largely on labor-intensive 2D animation produced by a workforce of low-wage artists who often earn barely enough to survive. However, a growing number of Japanese animation studios, including Polygon Pictures, SOLA Digital Arts, and others, are using modern 3D content creation software to lower the cost and improve the quality of animated content. These studios are disrupting the anime industry by producing content at a lower cost while retaining, and arguably improving on, the hand-drawn anime look. Moreover, these studios can also produce content with the more photorealistic appearance and fluid animation styles preferred by Western audiences.
Instead of manually drawing the pixels of each frame, 3D content creation software enables artists to produce richly detailed 3D polygon models that can be animated with more fluid motion and rendered to reproduce a hand-drawn or cel-shaded look, or virtually any other visual style. In this approach to animation, the 3D textured polygons, not the 2D pixels, are the fundamental elements created by the artists.
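The cel-shaded look mentioned above comes from the renderer, not the artist's pen: lighting computed on the 3D polygons is quantized into a few flat tones. A minimal sketch of that quantization step (the function name and band count are illustrative, not from any particular renderer):

```python
def cel_shade(intensity: float, bands: int = 3) -> float:
    """Quantize a 0..1 light intensity into discrete toon-shading bands."""
    if not 0.0 <= intensity <= 1.0:
        raise ValueError("intensity must be in [0, 1]")
    # Snap the intensity down to its band, then rescale to 0..1.
    band = min(int(intensity * bands), bands - 1)
    return band / (bands - 1)

# A smooth gradient collapses into three flat tones, giving the
# hand-drawn appearance from fully 3D geometry:
print([cel_shade(i / 10) for i in range(11)])
```

A real toon shader applies the same idea per pixel on the GPU, often adding ink-style outlines along polygon silhouettes.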
Game Engines Can Further Improve Productivity and Quality for Animated Content Creation
The next stage of disruption in the linear animation industry may be the incorporation of game engine technology into the production pipeline. Animation and digital effects studio ILM TV (a division of Lucasfilm-Disney) is using Epic Games' Unreal Engine to produce in-camera digital effects for The Mandalorian in real time, allowing rapid iteration on the final look of a scene while preserving the high image quality expected of ILM. ILM earlier used Unreal Engine to produce the final render of the robotic K-2SO character for the film Rogue One: A Star Wars Story.
In 2017, Digital Dimension became the first animation studio to produce a weekly show, Zafari (distributed by Comcast NBC), rendered entirely in the Unreal game engine, while Disney Television Animation is using the Unity game engine for final renders of Baymax Dreams.
Digital Dimension has shown that the transition from non-real-time rendering to real-time lighting and rendering speeds production and improves quality by shortening iteration times, giving artists immediate feedback on their design decisions. This movement of 3D content from non-real-time content creation systems to game engines is made easier because both systems use textured polygons as the fundamental artist-created data elements, which are later rendered as video frames of pixels by either offline or real-time (game engine) renderers.
A New Type of Stream for Games and Interactive Animated Content
The use of game engines in the production of linear animation and digital effects creates the intriguing possibility of a new type of interactive content that is streamed as textured polygons to game consoles, PCs, or any mobile device that can run a game engine. By intelligently prefetching just the textured polygons (along with animation, material, lighting, and certain other data) that the game engine needs at any given moment, such a stream could quickly deliver the data necessary to efficiently render animated content on the client device, without the long download times typically associated with digital game delivery.
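The prefetch idea above can be sketched in a few lines: the client keeps only the asset clusters the engine currently needs resident, requesting missing ones ahead of use rather than downloading the whole title up front. All names here (`AssetCache`, `fetch_cluster`) are illustrative, not part of any real streaming API:

```python
class AssetCache:
    """Toy client-side cache that prefetches asset clusters on demand."""

    def __init__(self, fetch_cluster):
        self._fetch = fetch_cluster   # callable: cluster_id -> encoded bytes
        self._store = {}              # clusters currently resident on the client

    def prefetch(self, needed_ids):
        """Request any clusters not yet resident, before the renderer needs them."""
        for cid in needed_ids:
            if cid not in self._store:
                self._store[cid] = self._fetch(cid)

    def get(self, cluster_id):
        """Hand a resident cluster to the renderer (assumed already prefetched)."""
        return self._store[cluster_id]

# Usage: as the camera moves, the engine asks only for nearby geometry.
cache = AssetCache(lambda cid: b"encoded triangles %d" % cid)
cache.prefetch([1, 2, 3])   # fetched ahead of rendering; repeats are free
```

The interesting engineering question, addressed later in the article, is how the system decides which cluster IDs to pass to `prefetch` before the viewer actually needs them.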
This type of stream will allow animated programs and digital effects sequences to optionally incorporate the full range of user interactivity enabled by game engines, while delivering a high-resolution presentation at high frame rates, all without video compression.
From a technical standpoint, this type of content stream could employ any CDN edge server to deliver the content, thereby avoiding the high operating cost of video-based cloud gaming systems, which must employ expensive, specialized game servers. Also, by operating primarily in a prefetch mode, such a stream could overcome the latency effects inherent in video-based cloud gaming systems.
From an entertainment standpoint, such a stream would enable an entirely new type of content for cable and OTT audiences. Imagine interactive action-adventures streamed through game console apps such as "Netflix Interactive" or "HBO Max Interactive," in which the user can lean back and enjoy the animated program or digital effects sequences at full, uncompressed 4K resolution and 60 frames per second, a signal that today's game consoles routinely output over an HDMI 2.0 cable at nearly 18 gigabits (not megabits) per second. At any moment, the viewer could pick up a game controller and explore a different narrative arc, customize or personalize a character, or instantly take up a challenge to participate in the action. In short, the viewer becomes more immersed and engaged in the story, setting, and action through the interactive storytelling methods that video game designers use to make interactivity an inviting, even compelling, element of modern entertainment.
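A back-of-the-envelope check of the uncompressed figure above, counting only raw RGB pixel data (blanking intervals and HDMI link-layer encoding add further overhead, which is how the signal approaches the 18 Gbit/s HDMI 2.0 ceiling):

```python
# Raw pixel data rate for uncompressed 4K at 60 fps, 24-bit RGB:
width, height, fps, bits_per_pixel = 3840, 2160, 60, 24
raw_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"{raw_gbps:.1f} Gbit/s of raw pixel data")  # ~11.9 Gbit/s
```

Either way, the rendered output is two to three orders of magnitude larger than a typical compressed video stream, which is the point: rendering locally from streamed polygons sidesteps video compression entirely.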
An Emerging Software Stack for Interactive Content Streaming
Instant Interactive, a division of Primal Space Systems in Raleigh, North Carolina, is pioneering the development of a game engine middleware protocol, GPEG™ (Geometry Pump Engine Group), for streaming interactive content to game consoles, PCs, mobile devices, and next-generation set-top boxes.
The GPEG protocol is game engine middleware that enables interactive streaming of game engine content over broadband connections. Unlike video-based cloud gaming services such as Stadia, GPEG streams pre-encoded game engine assets (clusters of triangles, micro-tiled textures, material data, blueprints, animation data, and so on) to a remote game engine running on a console, PC, or mobile device. This pre-encoded data can be stored on and streamed from any CDN server running GPEG server software, without the need for expensive game server hardware. The GPEG client plugin enables an "always on" configuration of the client-side game engine, which receives the stream packets to provide virtually instant access to interactive content without long download or level-load periods, and without the high cost, compression artifacts, and latency issues inherent in video-based cloud gaming approaches.
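Part of why pre-encoded assets suit ordinary CDN delivery is that each packet can be an immutable, self-describing blob, cacheable like any static file. A hypothetical framing for such a packet, with a fixed header followed by the encoded payload (the field layout and type codes are invented for illustration; GPEG's actual wire format is not public):

```python
import struct

# Header: packet id, asset-type code, payload length (little-endian).
HEADER = struct.Struct("<IHI")
TYPE_TRIANGLE_CLUSTER = 1
TYPE_MICRO_TILED_TEXTURE = 2

def encode_packet(packet_id: int, asset_type: int, payload: bytes) -> bytes:
    """Frame a pre-encoded asset as a self-describing blob."""
    return HEADER.pack(packet_id, asset_type, len(payload)) + payload

def decode_packet(blob: bytes):
    """Recover (packet_id, asset_type, payload) from a blob."""
    packet_id, asset_type, length = HEADER.unpack_from(blob)
    payload = blob[HEADER.size:HEADER.size + length]
    return packet_id, asset_type, payload

blob = encode_packet(42, TYPE_TRIANGLE_CLUSTER, b"encoded triangles")
assert decode_packet(blob) == (42, TYPE_TRIANGLE_CLUSTER, b"encoded triangles")
```

Because the blobs never change after encoding, a CDN edge cache can serve them to any number of clients without running game logic, which is what removes the specialized game server from the loop.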
GPEG uses innovative methods of conservative from-region visibility precomputation to pre-encode content as granular visibility event packets, which are interactively prefetched to the client game engine using navigation-driven prefetch methods that overcome network latency. The GPEG stream maintains a personalized, hyperlocal managed cache of data in the client game engine, with the server effectively assuming the visibility culling, residency management, and asset streaming functions of the client game engine.
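The navigation-driven prefetch described above can be sketched roughly as follows: an offline step precomputes, for each region the viewer can occupy, the set of geometry clusters that become visible on entering it, and at runtime the client prefetches packets for every region reachable within a prefetch horizon, so the data arrives before the viewer does. The regions, adjacency graph, and cluster names below are invented for illustration and do not reflect GPEG's actual encoding:

```python
# Precomputed per-region "visibility events": clusters newly needed on entry.
VISIBILITY_EVENTS = {
    "room_a": {"wall_1", "table"},
    "hall":   {"wall_1", "door", "rug"},
    "room_b": {"door", "statue"},
}
# Which regions the viewer can move to from each region.
ADJACENT = {"room_a": ["hall"], "hall": ["room_a", "room_b"], "room_b": ["hall"]}

def clusters_to_prefetch(current_region, resident, horizon=1):
    """Clusters needed by every region reachable within `horizon` moves,
    minus those already resident in the client cache."""
    frontier, reachable = {current_region}, {current_region}
    for _ in range(horizon):
        frontier = {n for r in frontier for n in ADJACENT[r]} - reachable
        reachable |= frontier
    needed = set().union(*(VISIBILITY_EVENTS[r] for r in reachable))
    return needed - resident

# Standing in room_a with an empty cache, prefetch covers room_a and hall:
print(sorted(clusters_to_prefetch("room_a", set())))
```

Because the from-region visibility is conservative (it may include a little extra geometry but never omits anything visible), the client can always render correctly from its cache, and the horizon can be tuned to the connection's latency.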
GPEG is being developed to enable game publishers, developers, and digital distributors to provide their customers with a better user experience by enabling virtually instant access and playability for game and VR content.
GPEG is also being built to enable new types of instantly interactive content for cable and OTT audiences. For animated programs and digital effects sequences that are produced using a game engine, GPEG will enable cable and OTT providers to instantly deliver compelling interactive experiences to the entertainment mainstream.
GPEG middleware will enable the textured polygon, and its attendant game engine data, to become the basis of streamed convergent media experiences in which the polygon is the new pixel.