<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Real-Time Rendering &#187; Bungie</title>
	<atom:link href="http://www.realtimerendering.com/blog/tag/bungie/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.realtimerendering.com/blog</link>
	<description>Tracking the latest developments in interactive rendering techniques</description>
	<lastBuildDate>Sun, 12 May 2013 00:21:14 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.4.1</generator>
		<item>
		<title>Digital Foundry interview with Halo: Reach developers</title>
		<link>http://www.realtimerendering.com/blog/digital-foundry-interview-with-halo-reach-developers/</link>
		<comments>http://www.realtimerendering.com/blog/digital-foundry-interview-with-halo-reach-developers/#comments</comments>
		<pubDate>Fri, 17 Dec 2010 00:44:28 +0000</pubDate>
		<dc:creator>Naty</dc:creator>
				<category><![CDATA[Resources]]></category>
		<category><![CDATA[Bungie]]></category>
		<category><![CDATA[Halo: Reach]]></category>

		<guid isPermaLink="false">http://www.realtimerendering.com/blog/?p=1833</guid>
		<description><![CDATA[Halo: Reach was one of the big game releases of 2010, so I was pleased to see a detailed technical interview with some of the developers on Eurogamer&#8217;s Digital Foundry website. I recommend you read the whole thing, but I&#8217;ll summarize some of the notable rendering tidbits (the interview also covered multiplayer, AI, and animation): [...]]]></description>
			<content:encoded><![CDATA[<p><em><a href="http://www.bungie.net/projects/reach/default.aspx">Halo: Reach</a></em> was one of the big game releases of 2010, so I was pleased to see <a href="http://www.eurogamer.net/articles/digitalfoundry-halo-reach-tech-interview">a detailed technical interview</a> with some of the developers on <a href="http://www.eurogamer.net/">Eurogamer</a>&#8217;s <a href="http://www.eurogamer.net/digitalfoundry/">Digital Foundry website</a>. I recommend you read the whole thing, but I&#8217;ll summarize some of the notable rendering tidbits (the interview also covered multiplayer, AI, and animation):</p>
<ul>
<li>The previous two games (<em>Halo 3</em> and <em>Halo 3: ODST</em>) used a &#8220;semi-deferred&#8221; approach, not for deferred lighting or shading but for decals. It sounds like they rendered a cut-down g-buffer (probably just diffuse color) in the first geometry pass (skipping small decoration objects like grass and pebbles), then blended decals into this buffer, and finally rendered the geometry a second time to do the lighting. <em>Halo: Reach</em> changed to a deferred lighting approach. Some lights were deferred and some weren&#8217;t; objects without decals or deferred lighting were only rendered once (this &#8220;hybrid deferred lighting&#8221; sounds similar to what Naughty Dog used for the <em>Uncharted</em> series).</li>
<li><em>Halo 3</em> used spherical harmonics in lightmaps to store directional lighting information (detailed in <a href="http://www.bungie.net/images/Inside/publications/presentations/lighting_material.zip">a GDC 2008 talk</a>, as well as a SIGGRAPH 2008 course &#8211; see <a href="http://developer.amd.com/gpu_assets/S2008-Chen-Lighting_and_Material_of_Halo3.pdf">slides</a> and <a href="http://developer.amd.com/documentation/presentations/legacy/Chapter01-Chen-Lighting_and_Material_of_Halo3.pdf">course notes</a>). For <em>Halo: Reach</em>, Bungie developed an improved light map representation that gave them &#8220;the same support for area light sources, improved contrast, fewer artifacts, a smaller memory footprint and much better performance&#8221;. This sounds really interesting; I hope they will describe this further in a conference presentation or article.</li>
<li>They developed a particle system which performs scene collisions on the GPU, using the depth and normal buffers as an approximate scene description. It can do tens of thousands of collisions / bounces per frame in 0.3 milliseconds (their previous CPU-based colliding particle system had a budget of 7 collisions per frame!). This system will be <a href="http://schedule.gdconf.com/session/12127">presented at GDC 2011</a> (the presentation will also discuss optimizations to their atmospheric effects system). This is a great idea &#8211; techniques like SSAO use depth/normal buffers as approximate scene descriptions for rendering, but this is the first time I have heard of this being done for simulation.</li>
<li><em>Halo 3</em> used two 8-bit-per-channel frame buffers with different exposure values for HDR effects (primarily bloom). Bungie described this scheme in a presentation called &#8220;HDR the Bungie Way&#8221; at two Gamefest conferences: <a href="http://download.microsoft.com/download/7/6/0/760ba04e-6952-4c14-a51e-fa54e02f3198/Graphics.zip">USA in 2006</a> and <a href="http://www.microsoft.com/downloads/en/details.aspx?FamilyId=995B221D-6BBD-4731-AC82-D9524237D486&amp;displaylang=en&amp;pf=true">Europe in 2007</a> &#8211; the 2006 (giant) zip file also contains an audio recording, but the 2007 one has more up-to-date slides (including screenshots). The <a href="http://www.bungie.net/images/Inside/publications/presentations/lighting_material.zip">GDC 2008 talk mentioned above</a> also discusses this scheme briefly towards the end. In contrast, <em>Halo: Reach</em> uses a single 7e3 buffer; this yields higher performance and frees up more EDRAM for shadow buffers but has less dynamic range (the primary result of this is loss of color in some bloom regions).</li>
<li>Instead of MSAA, <em>Halo: Reach</em> uses a simple temporal anti-aliasing method. The camera is offset by a half-pixel in alternate frames, and the last two frames are selectively blended (the blending is turned off on pixels that have moved significantly since the last frame).</li>
<li>They developed a new LOD system (to be <a href="http://schedule.gdconf.com/session/12125">presented at GDC 2011</a>) which automatically generates low-cost models to be used far from the camera. Combined with improved occlusion culling and GPU occlusion queries, this enabled a significant increase in draw distance.</li>
</ul>
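<p>The screen-space particle collision idea above can be sketched in a few lines of CPU-side Python. This is an illustrative model, not Bungie's implementation: project each particle into the frame, read the depth and normal buffers at that pixel, and reflect the velocity when the particle has passed behind the visible surface. All names and the data layout here are assumptions.</p>

```python
def collide_particles(particles, depth, normals, proj, width, height,
                      restitution=0.5):
    """Screen-space collision sketch: the depth buffer stands in for scene
    geometry and the normal buffer supplies the bounce direction.
    `proj` maps a world/view-space position to (pixel_x, pixel_y, view_depth).
    Illustrative only -- not Bungie's actual API or data layout."""
    for p in particles:
        x, y, z = proj(p["pos"])
        if not (0 <= x < width and 0 <= y < height):
            continue  # off-screen: no depth information available, skip
        scene_z = depth[y][x]
        if z >= scene_z:  # particle is at or behind the visible surface
            n = normals[y][x]
            v = p["vel"]
            # reflect the velocity about the surface normal, with energy loss
            d = sum(vi * ni for vi, ni in zip(v, n))
            p["vel"] = [restitution * (vi - 2.0 * d * ni)
                        for vi, ni in zip(v, n)]
    return particles
```

<p>The appeal of the technique is that the collision test is a single buffer fetch per particle, which is why it scales to tens of thousands of particles; the trade-off is that only surfaces visible in the current frame can be collided against.</p>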
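<p>For a feel of what a single 7e3 buffer gives up relative to a true HDR format, here is a sketch of the encode/decode arithmetic. The name 7e3 means a 7-bit mantissa and 3-bit exponent per 10-bit channel; the layout below (unsigned, exponent bias 3, denormals at exponent 0, maximum value 31.875) is the commonly described Xbox 360 EDRAM format, not something taken from the interview itself.</p>

```python
def decode_7e3(bits):
    """Decode a 10-bit 7e3 value (3-bit exponent in the top bits, 7-bit
    mantissa, no sign). Assumed layout: bias 3, denormals at exp == 0,
    giving a representable range of [0, 31.875]."""
    exp = (bits >> 7) & 0x7
    man = bits & 0x7F
    if exp == 0:                       # denormal: no implicit leading 1
        return man / 128.0 * 2.0 ** -2
    return (1.0 + man / 128.0) * 2.0 ** (exp - 3)

def encode_7e3(value):
    """Inverse of decode_7e3: clamp to the representable range, then pick
    the largest exponent whose implicit-1 range contains the value
    (round-toward-zero on the mantissa, for brevity)."""
    value = max(0.0, min(value, 31.875))
    for exp in range(7, 0, -1):
        scale = 2.0 ** (exp - 3)
        if value >= scale:
            man = int((value / scale - 1.0) * 128.0)
            return (exp << 7) | min(man, 127)
    return int(value * 512.0) & 0x7F   # denormal
```

<p>The limited exponent range is the "less dynamic range" mentioned above: anything brighter than 31.875 clamps, which is exactly where bloom regions can lose color information.</p>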
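<p>As background for the spherical-harmonics lightmap bullet: once a lightmap texel stores SH coefficients for the incoming radiance, per-pixel diffuse lighting reduces to a cheap polynomial in the surface normal. Below is the standard 9-coefficient irradiance evaluation (Ramamoorthi-Hanrahan constants) for one color channel; it illustrates the general technique, not Bungie's exact representation, and the coefficient ordering is an assumption.</p>

```python
def sh_irradiance(L, n):
    """Diffuse irradiance at unit normal n from 9 SH radiance coefficients
    L[0..8], ordered (L00, L1-1, L10, L11, L2-2, L2-1, L20, L21, L22).
    Standard SH-lightmap evaluation, one color channel."""
    x, y, z = n
    c1, c2, c3, c4, c5 = 0.429043, 0.511664, 0.743125, 0.886227, 0.247708
    return (c4 * L[0]
            + 2.0 * c2 * (L[3] * x + L[1] * y + L[2] * z)
            + c3 * L[6] * z * z - c5 * L[6]
            + 2.0 * c1 * (L[4] * x * y + L[5] * y * z + L[7] * x * z)
            + c1 * L[8] * (x * x - y * y))
```

<p>This is what makes SH lightmaps attractive over a single baked color: the stored coefficients retain directionality, so normal maps and area lights still interact plausibly at runtime.</p>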
]]></content:encoded>
			<wfw:commentRss>http://www.realtimerendering.com/blog/digital-foundry-interview-with-halo-reach-developers/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>