Category Archives: Miscellaneous

Two virtual conferences next week: SCA and GTC

Two virtual conferences start this coming week of October 5th:

  • SCA 2020, about computer animation, starts Tuesday October 6th
  • NVIDIA’s Fall GTC 2020, about, well, NVIDIA stuff, starts with a keynote at the end of Monday, October 5th with sessions in earnest the next day

I’ll “attend” both, since they’re both free to me. I’ve never been to SCA – animation is not something I know all that much about (hand-wave about skinning, morph targets, IK, simulation – you now know all I know), but the virtual format gives me an excuse to dip in, especially for the keynotes. The NVIDIA GTC conference has lots of ray-tracing related talks, so I’ll tune in for them. GTC is free to those with .edu and .gov email addresses (and NVIDIA employees).

SCA is also using Discord, like I3D did, and they kindly provide a short introductory video on how to use it. Even without registering, you can download and watch most of the full paper presentations right now, if you like. During the conference itself, 3-minute summary videos will be played for the papers in a session and then discussion will commence. I’ll be interested to see how this format feels. My attention span is short, as is everyone’s (20 minutes is about the absolute maximum), but 3 minutes I can do.

I admit I’ll likely be multitasking, listening and watching only when something catches my attention, so probably will absorb only a small bit of what I’d get if I attended in person. I’m also unlikely to do my homework, reading the papers and watching the presentations, as I’ll mostly be lurking. That’s one advantage of physical conferences: You’re there, and so are “forced” to pay attention. Yes, you can fiddle with your phone or laptop in a “real” talk, but it’s a bit rude and unseemly – why are you even attending? (Well, the answer is, for single-track conferences, that you went to the conference for three sessions in particular, but have time to blow during the other five sessions. I don’t have a great answer for that problem, and admit to the same – it’s hard to pay attention for 6+ hours each day.)

Anyway, take advantage of these *** times and try things out! Figuring out what makes a virtual conference work well and gets people engaged is going to be important for about the next year, sadly enough. But it’s also a chance to find what works well, uses people’s time most effectively and efficiently, and how people who can’t afford the time or money to attend a physical conference can still stay informed.

I3D 2020 starts this Monday

I3D 2020 (“ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games” for short) starts this Monday. Here are some pro tips.

  • If you want to integrate I3D 2020’s calendar with your own, see the instructions here.
  • The daily YouTube livestreaming links are in place.
  • Crowd-pleasing academic keynotes are from Ming Lin (University of Maryland) and Julien Pettré (Inria), and industry keynotes from Rachel Rose (ILM), Naty Hoffman (Lucasfilm), and David Morin (Epic Games). Expect Mandalorians. Late-breaking news: the ILM/Lucasfilm talk on Tuesday morning will be a live presentation only, not permanently recorded. Similarly, the Epic Games talk will be a live presentation only, as this will allow David to show some new content.
  • Paper presentations are queued up, and registered attendees will have access to the slides from these. If you don’t have an ACM Digital Library subscription, come Monday the papers themselves will be free to all to download for the week.
  • The VR posters space has opened up to also be the breaks and after-hours social room – works fine in your browser. You can visit any time (now, in fact), if you’re registered. Pick a nice avatar (I dibs the blocky fox).
  • Plan on the rest of Friday off: after the conference is officially over, we will coordinate Overwatch and Fall Guys gaming groups, or you can just invite other attendees yourself through Discord. Most attendee interaction this week will be through Discord (hey, “Games” is in the title of our conference), but you’ll need to register.

And, here’s the registration link (it’s free): https://bit.ly/i3d2020reg – make sure to copy (at least) the Discord information once you’ve registered

I3D 2020 VR poster space

Seven Things for August 27, 2020

Seven things for July 30, 2020

Well, I have about 59 things, but here are the LIFO bits:

Cancel your SIGGRAPH hotel; attend EGSR, HPG, and I3D 2020 virtually, free

SIGGRAPH: don’t forget to cancel your hotel reservation if you have one. Physical SIGGRAPH is gone this year, but your reservation lives on – I just noticed mine was still in the system. If your reservation is like mine, you’ll want to cancel by July 6th or pay a night’s fee penalty. SIGGRAPH is of course virtual this year. August 24-28 is the week scheduled – beyond this, I do not know.

EGSR: It’s happening this week. I should have posted this a few days ago, but it’s all on YouTube, so you haven’t actually missed anything.

The 31st Eurographics Symposium on Rendering will be all virtual and free to attend this year: watch it here. Talks can be watched live or afterwards on YouTube. Full conference program: https://egsr2020.london/program/

Registration is optional, but gets you access to the chat system where you can ask authors questions: https://egsr2020.london/. There are also two virtual mixers, Wednesday and Thursday: https://egsr2020.london/social-mixers/ – now to determine what 12:00 UTC is… (ah, I think that’s 8 AM U.S. Eastern Time).

HPG 2020: The High-Performance Graphics conference will happen virtually July 13-16. It is free, though there is a nominal fee if you want to attend the interactive Q&A sessions.

I3D 2020: The Symposium on Interactive 3D Graphics and Games is virtual and free to all this year. Dates are September 14-18. Details will be posted here as they become available.

As usual, publications and their related resources for these conferences are added by Ke-Sen Huang to his wonderful pages tracking them.

I’ll update this post as I learn more.

Ray Tracing Gems 2 updated deadlines

I’ll quote the tweet by Adam Marrs (and please do RT):

Due to the unprecedented worldwide events since announcing Ray Tracing Gems 2, we have decided to adjust submission dates for the book.

Author deadlines have been pushed out by five months. RTG2 will now publish at SIGGRAPH in August 2021, in Los Angeles.

More info here.

To save you a click, here are the key dates:

  • Monday February 1st, 2021: first draft articles due
  • Monday March 22nd, 2021: notification of conditionally and fully accepted articles
  • Monday April 5th, 2021: final revised articles due

And if you’re wondering, SIGGRAPH 2021 starts August 1st, 2021.

The key thing in the CFP: “Articles will be primarily judged on practical utility. Though longer articles with novel results are welcome, short practical articles with battle-tested techniques are preferred and highly encouraged.”

It’s nice to see this focus on making the book more about “gems,” concise “here’s how to do this” articles. There are lots of little topics out there covered in (sometimes quite) older books and blogs; it would be nice to not have to read five different ones to learn best practices. So, please do go propose an article. Me, I’m fine if you want to mine The Ray Tracing News, Steve’s Computer Graphics Index, etc.

 

The Center of the Pixel is (0.5,0.5)

With ray tracing being done from the eye much more now, this is a lesson to be relearned: code’s better and life’s easier if the center of the pixel is the fraction (0.5, 0.5). If you are sure you’re doing this right, great; move on, nothing to see here. Enjoy this instead.

Mapping the pixel center to (0.5,0.5) is something first explained (at least first for me) in Paul Heckbert’s lovely little article “What Are the Coordinates of a Pixel?”, Graphics Gems, p. 246-248, 1990.

That article is hard to find nowadays, so here’s the gist. Say you have a screen width and height of 1000. Let’s just talk about the X axis. It might be tempting to say 0.0 is the center of the leftmost pixel in a row, 1.0 the center of the next pixel over, and so on. You can even then use rounding, where floating-point coordinates of 73.6 and 74.4 both map to the center 74.0.

However, think again. Using this mapping gives -0.5 as the left edge, 999.5 as the right. This is unpleasant to work with. Worse yet, if various operators such as abs() or mod() get used on the pixel coordinate values, this mapping can lead to subtle errors along the edges.

Easier is the range 0.0 to 1000.0, meaning the center of each pixel is at the fraction 0.5. For example, integer pixel 43 then has the sensible range of 43.0 to 43.99999 for subpixel values within it. Here’s Paul’s visualization:

OpenGL has always considered the fraction (0.5,0.5) the pixel center. DirectX didn’t, at first, but eventually got with the program with DirectX 10.

The operation for proper conversion from integer to floating-point pixel coordinates is to add 0.5; from floating point to integer, use floor().
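As a minimal sketch of those two conversions (Python here, since the article’s code is shader-style pseudocode; the function names are my own):

```python
import math

def pixel_center(i):
    """Continuous coordinate of the center of integer pixel i: add 0.5."""
    return i + 0.5

def pixel_index(x):
    """Integer pixel containing continuous coordinate x: use floor()."""
    return math.floor(x)
```

So pixel 43’s center is 43.5, and any coordinate in [43.0, 44.0) maps back to pixel 43 – no rounding, no negative half-pixel edge cases.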

This is old news. Everyone does it this way, right? I bring it up because I’m starting to see in some ray tracing samples (pseudo)code like this for generating the direction for a perspective camera:

 float3 ray_origin = camera->eye;
 float2 d = 2.0 * 
     ( float2(idx.x, idx.y) / 
       float2(width, height) ) - 1.0;
 float3 ray_direction =
     d.x*camera->U + d.y*camera->V + camera->W;

The vector idx is the integer location of the pixel, width and height the screen resolution. The vector d is computed and used to generate a world-space vector by multiplying it by two vectors, U and V. The W vector, the camera’s direction in world space, is added in. U and V represent the positive X and Y axes of a view plane at the distance of W from the eye. It all looks nice and symmetric in the code above, and it mostly is.

The vector is supposed to represent a pair of values from -1.0 to 1.0 in Normalized Device Coordinates (NDC) for points on the screen. However, the code fails. Continuing our example, integer pixel location (0,0) goes to (-1.0,-1.0). That sounds good, right? But our highest integer pixel location is (999,999), which converts to (0.998,0.998) – a gap of 0.002 from the right and top edges, while pixel (0,0) sits exactly on the left and bottom edges. This bad mapping shifts the whole view over by half a pixel: the pixel centers should be 0.001 away from the edge on each side.

The second line of code should be:

    float2 d = 2.0 *
        ( ( float2(idx.x, idx.y) + float2(0.5,0.5) ) / 
            float2(width, height) ) - 1.0;

This then gives the proper NDC range for the centers of pixels, -0.999 to 0.999. If we instead transform the floating-point corner values (0.0,0.0) and (1000.0,1000.0) through this transform (we don’t add the 0.5 since we’re already in floating point), we get the full NDC range, -1.0 to 1.0, edge to edge, proving the code correct.
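Here is that corrected mapping as a small, checkable sketch (Python, with function names of my choosing):

```python
def ndc(x, extent):
    """Map a continuous pixel coordinate in [0, extent] to NDC [-1, 1]."""
    return 2.0 * (x / extent) - 1.0

def pixel_center_ndc(i, extent):
    """NDC coordinate of the center of integer pixel i: add the half-pixel
    offset before normalizing."""
    return ndc(i + 0.5, extent)
```

For a 1000-pixel-wide screen, pixel 0’s center lands at -0.999 and pixel 999’s at 0.999, while the floating-point edges 0.0 and 1000.0 map to exactly -1.0 and 1.0.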

If the 0.5 annoys you and you miss symmetry, this formulation is elegant when generating random values inside a pixel, i.e., for when you’re antialiasing by shooting more rays at random through each pixel:

    float2 d = 2.0 *
        ( ( float2(idx.x, idx.y) + 
                float2( rand(seed), rand(seed) ) ) /
            float2(width, height) ) - 1.0;

You simply add a random number from the range [0.0,1.0) to each integer pixel location value. The average of this random value will be 0.5, at the center of the pixel.
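A quick way to convince yourself of that average (a Python sketch; the names are mine): jittered samples within a pixel cluster around its center.

```python
import random

def jittered_sample(i, rng):
    """A random continuous coordinate inside integer pixel i.
    rng.random() returns a value in [0.0, 1.0)."""
    return i + rng.random()

rng = random.Random(12345)
samples = [jittered_sample(43, rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
# mean lands very near 43.5, the center of pixel 43
```

The same holds after the NDC transform, since it is linear: the jittered NDC samples average to the pixel center’s NDC value.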

Long and short: beware. Get that half pixel right. In my experience, these half-pixel errors would occasionally crop up in various places (cameras, texture sampling, etc.) over the years when I worked on rasterizer-related code at Autodesk. They caused nothing but pain on down the line. They’ll appear again in ray tracers if we’re not careful.

Seven Things for April 17, 2020

Seven things, none of which have to do with actually playing videogames, unlike yesterday’s listing:

  • Mesh shaders are A Big Deal, as they help generalize the rendering pipeline. If you don’t yet know about them, Shawn Hargreaves gives a nice introduction. Too long? At least listen to and watch the first minute of it to know what they’re about, or six minutes for the full introduction. For more more more, see Martin Fuller’s more advanced talk on the subject.
  • I3D 2020 may be postponed, but its research papers are not. Ke-Sen Huang has done his usual wonderful work in listing and linking these.
  • I mentioned in a previous seven things that the GDC 2020 content for graphics technical talks was underwhelming at that point. Happily, this has changed, e.g., with talks on Minecraft RTX, World of Tanks, Wolfenstein: Youngblood, Witcher 3, and much else – see the programming track.
  • The Immersive Math interactive book is now on version 1.1. Me, I finally sat still long enough to read the Eigenvectors and Eigenvalues chapter (“This chapter has a value in itself”) and am a better person for it.
  • Turner Whitted wrote a retrospective, “Origins of Global Illumination.” Paywalled, annoyingly, something I’ve written the Editor-in-Chief about – you can, too. Embrace being that cranky person writing letters to the editor.
  • I talk about ray tracing effect eye candy a bit in this fifth talk in the series, along with the dangers of snow globes. I can neither confirm nor deny the veracity of the comment, “This whole series was created just so Eric Haines would have a decent reason to show off his cool glass sphere burn marks.” BTW, I’ll be doing a 40 minute webinar based on these talks come May 12th.
  • John Horton Conway is gone, as we likely all know. The xkcd tribute was lovely, SMBC too. In reading about it, one resource I hadn’t known about was LifeWiki, with beautiful things such as this Turing machine.

Seven Things for April 16, 2020

Here are seven things, with a focus on videogames and related things this time around:

  • Minecraft RTX is now out in beta, along with a tech talk about it. There’s also a FAQ and known issues list. There are custom worlds that show off effects, but yes, you can convert your Java worlds to Bedrock format. I tried it on our old world and made one on/off video and five separate location videos, 1, 2, 3, 4, 5. Fun! Free! If you have an RTX card and a child, you’ll be guaranteed to not be able to use your computer for a month. Oh, and two pro tips: “;” toggles RTX on/off, and if you have a great GPU, go to Advanced Video settings and crank the Ray Tracing Render Distance up (you’ll need to do this each time you play).
  • No RTX or home schooling? Try Minecraft Hour of Code instead, for students in grades 2 and up.
  • There’s now a minigame in Borderlands 3 where you solve little DNA alignment puzzles for in-game bonuses. The loot earned is absurdly good at higher levels. Gearbox finally explained, with particularly poorly chosen dark-gray-on-black link text colors, what (the heck) the game does for science. It seems players are generating training sets for deep learning algorithms, though I can’t say I truly grok it.
  • Beat Saber with a staff is hypnotic. You can also use your skills outside to maintain social distancing.
  • A few Grand Theft Auto V players now shoot bullets to make art. Artists have to be careful to not scare the NPCs while drawing with their guns, as any nearby injuries or deaths can affect the memory pool and so might erase the image being produced.
  • Unreal Engine’s StageCraft tech was used to develop The Mandalorian. I’m amazed that a semicircular wall of LED displays could give realistic backgrounds at high enough resolution, range, and quality in real time. It has only 28 million pixels for a 270 degree display, according to the article – sounds like a lot, but note a single 4K display is 3840 * 2160 = 8.3 million pixels.
  • Stuck inside and want to make your housing situation an infernal hellscape, or at least more of one? Doomba’s the solution. It takes your Roomba’s movement information and turns it into a level of classic Doom.

Made it this far? Bonus link, since it’s the day after U.S. taxes were due, but now deferred until July 15th: Fortnite virtual currency is not taxable.