Category Archives: Book

Bring On the Errors!

We just found out that we’re about due for a second printing of “Real-Time Rendering, 3rd Edition.” A new printing means we can correct small errors. So, please, let us know of any mistakes or glitches you’ve found in the book, no matter how minor. Our list of known corrigenda (fancy name for errors) is here, and the very minor errors here. Due date is April 13th.

Just Cause 2 makes 3

A demo of the game Just Cause 2 is available on Steam today. What’s interesting is that this is the third DirectX 10-only game to be released. There have been any number of DirectX 10 enhanced games, but until a few months ago there was just one DirectX 10-only game release, Stormrise, a mediocre game released in March 2009. Shattered Horizon then came out in November from Futuremark, who are known more for their graphics benchmarks. Just Cause 2 is a sequel, and distributed by a well-known publisher. Humus describes the logic in going DirectX 10-only.

I’m looking forward to seeing how DirectX 11’s DirectCompute gets used in commercial applications. Perhaps the day there’s a DirectX 11-only game of any significance is the day we need to start writing a fourth edition. Let’s see: DirectX 10 was released November 2006 with Vista, so it took about three and a quarter years for an anticipated game to be released that was DirectX 10-only (and even now it’s considered dangerous by many to do so). DirectX 11 was released in October 2009, so if the same rule holds, then we’ll need to start writing in February 2013. Pre-order today!

Even now, 13% of Steam gamers have only SM 2.0 hardware. Games like World of Warcraft and Left 4 Dead 2 don’t require more, for example. So what’s the magic percentage at which AAA games decide to set the minimum level to the next shader model? I don’t recall the transition from shader model 2.0 to 3.0 games being much of an issue; there was a little hype, but I think that was because going from SM 2.0 to 3.0 required just a card upgrade, not an OS upgrade. Which is funny, in that an OS upgrade is usually cheaper than a new GPU, but I suspect the OS also feels more critical to replace, like a heart transplant vs. a cornea transplant.

Poking around, I found the interesting graphs below. I’m sure games have been left off, and some are miscategorized, e.g. Cryostasis is the only one under SM 4.0, and it doesn’t require DirectX 10. But, let’s assume this data is semi-reasonable; I’m guessing the games are categorized more by a “recommended configuration” than a minimum. So Shader Model 1.x game releases (and remember, 1.x was pretty darn limited) peaked in 2006; 2.0 peaked in 2007 but outnumbered 3.0 until 2009. SM 3.0 hasn’t peaked yet, I’d say (ignore 2010 and 2011 graph values at this point, of course). Remember that SM 2.0 hardware came out around 2002, so it peaked 5-6 years later and still was strong 7 years later (and perhaps longer, we’ll see). SM 3.0 came out in 2004, and seems likely to continue to be strong through 2010 and into 2011. 4.0 came out in 2006, so I’d go with it peaking in 2011-2012 from just staring at these charts. Which entirely ignores the swirl of other data (Vista and Windows 7, Xbox trends, GPU trends, blah-di-blah), but it’ll be interesting to see if this prediction is about right. (Click on a graph for the lists of games for that shader model.)

Shader Model 1.x

Shader Model 2.0

Shader Model 3.0

Amazon Needs Programmers, We Suspect

… at least judging from an email Phil Dutre received and passed on. Key excerpt follows:

Dear Amazon.com Customer,

As someone who has purchased or rated Real-Time Rendering by Tomas Moller, you might like to know that Online Interviews in Real Time will be released on December 1, 2009.  You can pre-order yours by following the link below.

With a title-finding algorithm of this quality, Amazon appears to be in need of more CS majors.

Don’t fret, by the way, I’ll be back to pointing out resources come the holidays; things are just a bit busy right now. In the meantime, you can contemplate Morgan McGuire’s gallery of real photos that appear to have rendering artifacts or look like computer graphics. It’s small right now – send him contributions!

SMOG Results

My wife just told me about the SMOG readability formula, which is evidently widely used. “SMOG” stands for Simple Measure of Gobbledygook. It counts the number of polysyllabic words (3 syllables or more) used in a document. Roughly, you take that count, normalize it to a 30-sentence sample, take the square root, and scale the result to get a readability grade level; read more on Wikipedia.
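For the curious, here’s a quick sketch of the grade computation itself, using the formula as Wikipedia gives it. Counting syllables and sentences is the fiddly part and is left out here, and the example counts are made up:

```cpp
#include <cmath>
#include <cstdio>

// SMOG grade, per the formula on Wikipedia:
//   grade = 1.0430 * sqrt(polysyllables * (30 / sentences)) + 3.1291
// Counting the polysyllabic words and sentences is up to the caller.
double smogGrade(int polysyllableCount, int sentenceCount)
{
    return 1.0430 * std::sqrt(polysyllableCount * (30.0 / sentenceCount)) + 3.1291;
}

int main()
{
    // Made-up counts: 90 polysyllabic words in a 30-sentence sample
    // comes out to roughly grade 13, i.e., college freshman level.
    std::printf("SMOG grade: %.2f\n", smogGrade(90, 30));
    return 0;
}
```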

I ran the calculator here on a few passages in our book (those without equations, which I thought would throw the calculator off): Deferred Shading, Fresnel Equation, Scene Graphs, and the final chapter. Scene Graphs was simplest, at 12.56, Fresnel hardest, at 14.1. On average the level was a bit above 13, meaning College Freshman level. Pieces such as this one weigh in at 17.12. I took a piece of text from Hearn and Baker’s old Computer Graphics, C Version, 2nd Edition, on Fractals, and it came up as 14.47. So our book’s no Hop on Pop, but it’s at least not horrifically hard and seems in the ball park for our target audience.

By the way, this post’s SMOG grade is 11.21.

Our book’s figures now downloadable for fair use

A professor contacted us about whether we had digital copies of our figures available for use on her course web pages for students. Well, we certainly should (and our publisher agrees), and would have done this a while ago if we had thought of it. So, after a few hours of copying and saving with MWSnap, I’ve made an archive of most of the figures in Real-Time Rendering, 3rd edition. It’s a 34 MB download:

http://www.realtimerendering.com/downloads/RTR3figures.zip

Update: preview and download individual figures on Flickr

Update: figures for the Fourth Edition are here.

This archive should make preparation a lot more pleasant and less time-consuming for instructors, vs. scanning in pages of our book or redrawing figures from scratch. Here’s the top of the README.html file in this archive:

These figures and tables from the book are copyright A.K. Peters Ltd. We have provided these images for use under United States Fair Use doctrine (or similar laws of other countries), e.g., by professors for use in their classes. Not all figures in the book are included; only those created by the authors (directly, or by use of free demonstration programs, as listed below) or from public sources (e.g., NASA) are available here. Other images in the book may be reused under Fair Use, but are not part of this collection. It is good practice to acknowledge the sources of any images reused; we suspect a link to http://www.realtimerendering.com would be useful to students, and we have listed relevant primary sources below for citation. If you have questions about reuse, please contact A.K. Peters at service@akpeters.com.

I’ve added a link to this archive at the top of our main page. I should also mention that Tomas’ PowerPoint slide sets for a course he taught based on the second edition of our book are still available for download. The slides are a bit dated in spots, but are a good place to start. If you have made a relevant teaching aid available, please do comment and let others know.

Corrigenda

“Corrigenda” is a classy publisher’s word for “bugs,” but it also means listing fixes for these errors. Morgan McGuire and his students have been reading our book closely, and have found the first two significant errors in the 3rd edition. These errors and their corrections can be found on our corrigenda page.

Donald Knuth sends checks for $2.56 for each error found in his classic (but still being written) series “The Art of Computer Programming”; Sir James Murray, the editor of the first Oxford English Dictionary, was perhaps the first to reward readers in this way. Knuth has an even more lucrative/costly reward doubling scheme for errors found in his software, with the prize now locked at $327.68.

Tomas offered his students a piece of candy for each error they found in our second edition. I like the idea of rewarding readers in some way, beyond naming them on the page. We’ll think of something; suggestions? More important, have you found any bugs, large or small? Please do pass them on, as it helps everyone.

Tim Sweeney Interview

Tim Sweeney is a cofounder of Epic Games and lead developer behind the graphics engines for the Unreal series of games. Jon Stokes has a meaty interview with him, up on Ars Technica; go read it!

Fourth edition might be C++?

Tim talks about how the GPU has become general enough that we will soon be able to get away from rasterization as the only rendering algorithm. Ten years ago, dealing with an API to do all interactive graphics was limiting. Widening it out with programmable shaders gives more flexibility, but at the cost of the complexity of managing the programming environment. Nowadays you’re programming two separate computers that talk to each other. The shift to parallel programming is already a major change in how we need to think about computers, one that hasn’t become a core concept for most of us yet (myself included; I’m doing my best to wrap my head around Intel’s Threading Building Blocks, for example). Doing such programming in a few different languages is a “feature” we’d all love to see go away.
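As a tiny aside, here’s the flavor of what a data-parallel loop looks like in Threading Building Blocks. This is a minimal sketch of my own, assuming the lambda-style interface and some made-up image data, not anything from the interview:

```cpp
#include <tbb/blocked_range.h>
#include <tbb/parallel_for.h>
#include <vector>

// Scale an array of luminance values across however many cores are available.
// The runtime splits the range into chunks and hands them out to worker threads.
void brighten(std::vector<float>& pixels, float scale)
{
    tbb::parallel_for(tbb::blocked_range<size_t>(0, pixels.size()),
        [&](const tbb::blocked_range<size_t>& r) {
            for (size_t i = r.begin(); i != r.end(); ++i)
                pixels[i] *= scale;
        });
}
```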

With Larrabee, CUDA, and compute shaders, the trends of more flexibility continue, though in different flavors. It seems unlikely to me that the pipeline model itself for rendering will fade in popularity any time soon, though rasterization (traditional GPUs) vs. tiling (Larrabee, handhelds) will continue to be a debate. Tim mentions voxel rendering techniques (really, heightfield, in the old games) as something that died once the GPU took over. True. Such techniques are making a return on the GPU even today, via relief mapping and adaptive tessellation. We’re also seeing volume rendering by marching along rays; if an algorithm can be refit to work on a GPU, it will find some use.
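To give a sense of what “marching along rays” means here, a toy sketch of the inner loop, with a made-up density field and simple front-to-back opacity accumulation; nothing GPU-specific, just the idea:

```cpp
#include <algorithm>

// A made-up density field: a soft sphere of radius 1 centered at the origin.
float densityAt(float x, float y, float z)
{
    float r2 = x * x + y * y + z * z;
    return r2 < 1.0f ? 1.0f - r2 : 0.0f;
}

// Step along a ray through the volume, accumulating opacity front to back.
// Returns how opaque the volume looks along this ray.
float marchRay(float ox, float oy, float oz,   // ray origin
               float dx, float dy, float dz,   // unit ray direction
               float maxDist, float stepSize)
{
    float opacity = 0.0f;
    for (float t = 0.0f; t < maxDist && opacity < 0.99f; t += stepSize)
    {
        float d = densityAt(ox + t * dx, oy + t * dy, oz + t * dz);
        // Whatever transparency remains absorbs the newly sampled density.
        opacity += (1.0f - opacity) * d * stepSize;
    }
    return std::min(opacity, 1.0f);
}
```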

So I agree, the increase in flexibility will be all to the good in letting programmers again do much more than render textured opaque triangles via a Z-buffer really fast and most everything else not-so-fast. Frankly, I believe much of the buzz about interactive ray tracing is more an expression of yearning by us graphics programmers that we could actually program again, vs. calling an API. The April Fool’s Day spoof about ray tracing in DirectX 11 fooled a number of people I know, I believe because they wished it were true. Having hacked my fair share of rendering algorithms, I certainly see the appeal.

I think Tim’s a bit overoptimistic on the time frame in which such changes will occur. First, everyone needs to get this future hardware. Sure, NVIDIA points out there are 70 million CUDA-capable graphics cards out there today, but no one is floating CUDA-based programs as alternative interactive renderers at this point (though NVIDIA’s experiments with CUDA ray tracing are wonderful to see). DirectX 9 graphics cards will be around for years to come. Just as significant, making such techniques part of the normal development toolchain also takes a while. I think of how long normal (dot-product) bump mapping, introduced around 2001, took to become a feature that was used in games: first most GPUs had to support it, then tools had to generate and manage the maps, then artists had to be trained to use the tools, etc.

When the second edition of our book came out, it was a few hundred pages longer than the first. I held out the hope to Tomas that our third edition would be shorter. My logic was that, with programmable shaders coming to the fore, we wouldn’t have to cover all the little variants that were possible, but rather could just present pure algorithms and not worry about the implementation details.

This came true to some extent. For example, we could cut out chunks of text about extremely specific ways to efficiently compute the Fresnel term, or give examples showing how assembly instructions are packed together in a pixel shader. There was now plenty of space on the GPU for shader instructions, so such detail was nonsensical. It would be like a programming languages book listing all the programs that could be written in the language. We still do have to spend time dealing with the vagaries of the APIs, such as the relatively space-inefficient ways in which triangles are fed through the pipeline (e.g. a “compact” representation of a cube must use 24 separate vertices, when all that is really needed are 8 points and 6 normals).
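To make the cube example concrete, here’s a sketch of why the count balloons: each corner is shared by three faces with three different normals, so the API wants 6 faces × 4 corners = 24 vertices, even though the underlying data is only 8 positions and 6 normals. The corner ordering below is made up for illustration:

```cpp
struct Vec3 { float x, y, z; };
struct Vertex { Vec3 position; Vec3 normal; };

// The data we "really" need: 8 corner positions and 6 face normals.
const Vec3 corners[8] = {
    {-1,-1,-1}, {1,-1,-1}, {1,1,-1}, {-1,1,-1},
    {-1,-1, 1}, {1,-1, 1}, {1,1, 1}, {-1,1, 1}
};
const Vec3 faceNormals[6] = {
    {0,0,-1}, {0,0,1}, {-1,0,0}, {1,0,0}, {0,-1,0}, {0,1,0}
};
// Which 4 corners make up each face (any consistent ordering will do).
const int faceCorners[6][4] = {
    {0,1,2,3}, {4,5,6,7}, {0,3,7,4}, {1,2,6,5}, {0,1,5,4}, {3,2,6,7}
};

// What the API actually wants: one vertex per face corner, so that normals can
// differ at shared positions -- 6 faces * 4 corners = 24 vertices.
void buildCubeVertices(Vertex out[24])
{
    for (int f = 0; f < 6; ++f)
        for (int c = 0; c < 4; ++c)
            out[f * 4 + c] = { corners[faceCorners[f][c]], faceNormals[f] };
}
```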

Counterbalancing such cuts in text, we found we had many more algorithms to write about. With the increase in abilities of each successive generation of GPUs and APIs, coupled with research into ways to efficiently map algorithms onto new architectures, the book became considerably longer (and certainly heavier, since each illustration’s atoms now needed 3 bytes instead of 1). So, I’m not holding out much hope for a shorter edition next time around; there’s just so much cool stuff that we can now do, and more yet to come.

Incidentally, we had asked Tim for a pithy quote for our new Hardware chapter. He said he didn’t have anything, but passed on one from Billy Zelsnack. This quote was tempting, but instead we used it in our last chapter: “Pretty soon, computers will be fast,” which I just love for some reason. It may sometimes take 20 seconds to open a file folder on Windows today, but I remain hopeful that someday, someday…

At long last, in stock

Lately I’ve been looking at Amazon’s listing of our book daily, to see if it’s in stock. Finally, today, it is, for the first time ever, a mere 40 days after its release. Not our publisher’s fault at all (A.K. Peters rules, OK?), and the book’s not that popular (AFAIK); it evidently just takes a while for delivered books to percolate out into Amazon’s system. Amazon under-ordered, so I believe by the time the books they first ordered made it to the distribution centers, they were already sold out, making the book again out of stock. Lather, rinse, repeat. So maybe I should be sad that it’s now in stock.

Anyway, the amusing part of visiting each day has been looking at the discount given on the book. It’s nice to see a discount at all, as Amazon didn’t discount our previous book for the first few years. With the current 28% discount, it means our new edition is effectively $5 less than the previous edition’s original price. Which cheers me up, as I like to imagine that students are saving money; my older son will be in college next year, and any royalties I make from our book will effectively get recycled over the next four years in buying his texts. His one book for a summer course this year was a black & white softbound book, 567 pages, and cost an astounding (to me) $115, and that was “discounted” from $128.95. I’m now encouraging my younger son to skip college and go into the lucrative field of transistor repair.

Amazon’s discount has varied like a random walk among four values: 0%, 22%, 28%, and 33%. Originally, in July, it was list price, then the discount was set at 33% (so Amazon was paying more for the book than they were selling it for), then back to normal, then 33%. Around August 14th I started checking once a week or so and also looking at Associates sales (a program I recommend if you’re a book author, as it’s found money – it pays for this website). Again the book went back to no discount, then on August 20th started at 0%, went to 22% off, then 33% off, all in the same day. The next day there was no discount, then the day after it went back to 33%. August 28, when I checked again, it was at 22%, and this discount held through the end of the month. On September 1st it went up to 28% off, and there it’s been for a whole 9 days.

The oddest bit was that, in searching around for prices (Amazon’s is indeed the best, at least as of today), I noticed that the first edition of our book, from 1999, sells used for twice as much or more than our new book. Funny world.

By the way, if you are looking to write a book and want to understand royalties and going rates a little bit better, see my old article on this topic. Really, it’s not my article, it’s a collection of responses from authors I know. Some of it’s a bit confrontational and might make you a little paranoid, but I think it’s worth a read. If you’re writing technical books to get rich, you’re fooling yourself, but on the other hand there’s no reason to let someone take advantage of you. My favorite author joke, from Michael Cohen via John Wallace, is that there are dozens of dollars to be made writing a book, dozens I tell you. It can be a bit better than that if you’re lucky, but still comes out to about minimum wage when divided by the time spent. But for me it’s a lot more fun and educational work than flipping burgers, and the money is not why we wrote our book. We did it for the wild parties and glamorous lifestyle.

Update: heh, that didn’t last long. I wrote this entry Sept. 9th. As of the 10th, the book is (a) out of stock again and (b) down to a 2% discount. 2%?! Truly obscure.

SIGGRAPH 2008: The Authors Meet

All the work on the book was done remotely, via email and CVS. In fact, I had never met Tomas until this morning at SIGGRAPH. Here you can see all three of us, pleased as punch that the book is finally done. Left to right, Eric, Naty, Tomas:

(Eric here. I guess this is a tradition: Tomas and I didn’t meet until after the first edition was published.)

Amazon discount, SIGGRAPH booth time

The book’s not quite shipping yet, but at this point Amazon has it heavily discounted, 33% off. I’m happy about this, as it makes the book cheaper than the second edition, which wasn’t discounted at all by Amazon until recent years. The weird bit is that this discount was available a few weeks back, then was gone when I checked last weekend. Someone let me know today that it’s back, and I just ordered an extra copy (this discount is higher than my author’s discount at A.K. Peters). I’ve noticed a strong correlation between the discount’s availability and the humidity in Flagstaff multiplied by the average hourly meteor sighting rate in Anchorage. In other words, I have no clue when someone will wake up at Amazon and realize they’re paying more for the book than they’re selling it for (it’s true: my publisher said so).

While I’m thinking of it: Naty and I will be at the A.K. Peters booth at SIGGRAPH from 12:30 to 1:30 pm on Wednesday.