Category Archives: Resources

Ray Tracing Monday, Again

I enjoyed the previous Ray Tracing Monday two weeks ago, so let’s do it again.

Since it’s been two weeks, two resources:

  • “Real-Time Ray Tracing” – this is chapter 26 of Real-Time Rendering, which we just finished, and it is now free for download. It’s 46 pages, with a focus on DXR but also a detour through everyone’s favorite topic, spatial data structures for efficient ray tracing. Tomas, Angelo, and Seb did the heavy lifting on this one over the past three months; I helped edit and added bits where I could.
  • Ray Tracing Resources Page – along the way we ran across lots of resources. With the chapter finished as of today, I was inspired to make a page of links, mostly things I don’t want to lose track of myself, or that I found of historical interest. Please do feel free to send me awesome resources to add, etc.

One entertaining thing I ran across was an image in the gallery for Chunky, a quite-good custom path tracer for Minecraft worlds:

Cornell blox

CC0 – Public Domain for the World

Steve Hollasch mentioned that there’s a “new” (well, new to me – it’s 9 years old) Creative Commons instrument, CC0. Their website has an explanation of the problem with trying to put something you made into the public domain, and how CC0 solves this. Open Knowledge International (no, I’d never heard of them, either) recommends it, which I’ll take as a good sign. I didn’t know of this CC0 beast, and suspect readers here don’t, either, so now you do. It’s mostly not a license; it’s a “dedication,” a way to ensure that something you created is considered unowned and free to reuse in every country.

If you want to make sure your code is properly credited to you, use something such as the MIT license instead, or some other (often more restrictive) choice. Creative Commons recommends not using their other licenses for code, but rather some common source code license. I’m assuming (it’s not super-clear) that CC0 is also fine for code you’re putting into the public domain. Update: aha, Steve Hollasch sent this follow-up link – CC0 can be applied to code, and that link shows you how to do it.

Update: I received a number of interesting responses from my tweet of this post. David Williams points out that CC0 is approved by the Free Software Foundation for putting code in the public domain, as it has a fallback license for countries where public domain is not recognized. Arvid Gerstmann notes that dual-licensing with CC0 and the MIT license may be an even better option, for those companies where the lawyers haven’t approved the lesser-known CC0 but have approved use of code with the MIT license.

I know this all sounds like “it doesn’t matter, I’m never going to enforce this,” but it does, sadly. With Graphics Gems we made up a license long ago (basically, “don’t be a jerk”) because someone was trying to sell his company to a larger firm, which had some software testing firm test the smaller company’s code for copyright infringements, and various bits of Graphics Gems code popped up. If CC0 had existed, or maybe even the questionable (and rude) WTFPL, we would have gone with that. Happily, there is now the CC0. (tweet)

Win a free copy of Real-Time Rendering, 4th Edition, and receive it someday…

Now that I’m back from SIGGRAPH, I can catch up on all the things. So here’s one: win a free copy of Real-Time Rendering, 4th Edition. Our publisher is giving away three copies; the deadline to enter is August 31.

As far as actually receiving a copy of the book goes, well, if it’s any consolation, none of us authors have a physical copy at this point, either. Our publisher wrote on August 8th:

This reprint should be in the warehouse within the next 3 weeks.  I assume the fulfillment dept will give customer orders priority over author copies.

So it’s a case of the shoemaker’s children going barefoot. Amazon says the book’s back in stock on August 27th.

I do like that the first three chapters are free on Amazon for Kindle, and on Google Play, so I hope that will tide people over until these ship. That this much content was made free was unexpected – a happy decision on the publisher’s part. If you’re done with those chapters and still waiting, don’t forget to read Pete’s now-free books on ray tracing.

I got to see a physical copy of our book at SIGGRAPH, so I know such things exist. I also bought the book on Kindle (which at first had some download problem on my iPhone and PC, but downloaded fine the next day) and on Google Play (I was surprised to find it there; same price as Kindle, by some amazing coincidence), as I wanted to see whether a layout problem in my local copy was present in the published book (happily, it wasn’t – ahhh, the mysteries of LaTeX).

One of the best parts of SIGGRAPH was actually meeting my coauthors. The wild party on the yacht that night in Vancouver Harbor was really something, too, but then I realized I made that up.

Eric, Angelo, Naty, Seb, Tomas, and Michal; photo courtesy of Mauricio Vives

Ray Tracing Monday

OK, everything happened today, so I’m starting to believe the concept of time is no longer meaningful.

First, NVIDIA announced the consumer versions of its RTX ray tracing GPUs, which should come as a shock to no one after last week’s Ray Tracing Monday at SIGGRAPH. My favorite “show off the ray tracing” demo was for Battlefield V.

Then, this:

https://twitter.com/Peter_shirley/status/1029342221139509249

I love free. To get up to speed on ray tracing, go get the books here (just in case you can’t click on Pete’s link above), or here (our site, which shows related links, reviews, etc.). Then go to the SIGGRAPH DXR ray tracing course site – there’s even an implementation of the example that’s the cover of Pete’s first book.

Up to speed already? Start writing an article for Ray Tracing Gems. At SIGGRAPH we found that a few people thought they had missed the proposals deadline. There is no proposals deadline. The first real deadline is October 15th, for completed articles. We will judge submissions, and may reject some, but our goal is to try to work with any interested authors before then, to make sure they’re writing something we’ll accept. So, you can informally write and bounce ideas off of us. We avoided the “proposals” step in order to give people more time to write and submit their ideas, large and small.

BTW, as far as free goes, we’re aiming to make the e-book version of Ray Tracing Gems free, and to have the authors retain reprint rights for their works.

All for now. Day’s not over yet.

One Last Link – RTR4 Reference Page “Done”

I finally finished the Sisyphus-like task of adding useful links for RTR4’s references. For this brief moment I think all the links on that page work – enjoy it for the few minutes it lasts (and feel free to send me fixes, though I may blithely ignore these for a bit, as I’m sick to death of this task – no mas!). At the top of the page I note some pleasant tools, such as the Google Scholar search button extension, which saved me a lot of copying and pasting titles.
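
If you feel like hunting for broken links yourself before sending fixes, a script can do the drudgery. Here’s a minimal Python sketch of the idea, assuming you have the third-party requests and beautifulsoup4 packages installed; the URL at the bottom is a placeholder, not the actual references page:

    import requests
    from bs4 import BeautifulSoup

    def check_links(page_url):
        # Fetch the page and try every absolute link on it.
        html = requests.get(page_url, timeout=30).text
        soup = BeautifulSoup(html, "html.parser")
        for a in soup.find_all("a", href=True):
            url = a["href"]
            if not url.startswith("http"):
                continue  # skip relative links, mailto:, etc.
            try:
                r = requests.head(url, allow_redirects=True, timeout=15)
                if r.status_code >= 400:
                    print(r.status_code, url)
            except requests.RequestException as err:
                print("FAIL", url, err)

    check_links("https://example.com/refs.html")  # placeholder URL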

Oh, also, the free, online Collision Detection chapter now has its hyperlinked bibliography up, too, along with the appendices.

I’m writing a post mostly because I found this oddity: The classic paper

Torrance, K., and E. Sparrow, “Theory for Off-Specular Reflection from Roughened Surfaces,” Journal of the Optical Society of America, vol. 57, no. 9, pp. 1105–1114, Sept. 1967

is not one Google Scholar knows about in English; it turns up a version in Japanese, which was a surprise. Searching Google as a whole, I found that Steve Westin still has a copy squirreled away. Paper archiving is a house of cards, I tells ya.

Next task: work on our main page of resources.

Less Movable Targets

Here’s an update to my previous blog post, on the volatility of web links.

The Twitter post has a bunch of responses, with some useful tidbits in there. Some resources mentioned: HTML5 UP! for free CC templates; gamedev.net, which has been around for almost 20 years and can act as an archive; and gamedevs.org, which keeps some old presentations around. Go three paragraphs down for some web hosting suggestions. The idea of using the archive.org link as the “real” link is clever (and a bit sad), but assumes archive.org will always be around. Note that publishers such as the ACM allow you to put your published articles up on your homepage, your institution’s website, and on non-commercial repositories. I’m not sure how entities such as ResearchGate (where I’ve seen a number of papers stored) fit into this picture – they appear to be for-profit, e.g., they sell advertising, so I don’t think they fall into any of the ACM’s categories. I appreciate their efforts, but am concerned that papers there may someday go away, as ResearchGate simply hasn’t been challenged by the ACM or others yet. Again, long-term durability is a question.

Also see the comments after the original post. My comment on “The sooner these are displaced by open publications like the JCGT, the better” is that, in graphics, there are no other free (to both readers and authors) journals, at least none that I know about. arXiv maybe qualifies. Looking there today, this article seemed like a handy summary, pointing to some resources I hadn’t known of before. But when I tried to go to a site they mention in their article, Chrome warned, “Attackers might be trying to steal your information from dgtal.org” – OK, never mind. There might be great stuff at arXiv, but it seems like a firehose (10 articles published in graphics in the last week), without serious peer review. Editorial filtering and peer review are worth a lot. I guess you might be able to use a strategy of putting your preprint at arXiv, sort of like ResearchGate but less questionable (arXiv is run by Cornell). This approach is underutilized within graphics, AFAIK: only 2 papers on our refs page are available this way, vs. 25 for ResearchGate. If someone wants to explain what I’m missing here, great! Update: the ACM now permits authors to put preprints on arXiv.

Thanks to you all for the followups; my thoughts remain about the same: corporations come and go, more quickly than we expect. While I have a lot of faith in various institutions, ultimately I think the entity that best looks out for my interests is me. Having my own domain and website is good insurance against the vagaries of job changes, corporate services disappearing (or the corporations themselves), and webmaster turnover. Me, I’m a cheapskate: http://erichaines.com is just a subdomain of realtimerendering.com, of which I’m the prime webmaster; we also host a number of other groups as subdomains, such as the Advances in Real-Time Rendering course notes repository and Ke-Sen’s invaluable work tracking conference articles – doing so costs me no time or money, as others maintain them. So another option is to share a domain and host among a bunch of people.

Yes, your own website costs a little money (the price of two cups of Starbucks per month), but admit it: you pay more in a month for your smartphone and internet service provider than the yearly cost of a website. It’s a bit of effort initially to register a domain and set up a website, but once the template and blog are in place, you’re done. Write a new article or slide set, one that took you hours or weeks to create? It’s five minutes to add it to your web page and upload it. Morgan McGuire, Andrew Glassner, and I like bluehost. Sven Bergström likes digitalocean for $5/month hosting, and gives some setup and admin tips. His previous favorite was site5. Sebastian Sylvan likes nearlyfreespeech, which I hadn’t heard of and looks quite cheap for a personal site – possibly something like $3.65 a year, plus $12 per gigabyte stored (or maybe less – the pricing is not clear), with a free gigabyte of downloads a day – assuming you’re not serving up huge files and don’t get popular. ijprest notes in the comments that Amazon’s S3 hosting is bare bones, just basic hosting, but about as cheap as nearlyfreespeech and pretty much guaranteed to outlast you.

Update Nov. 2019: A few more options, just in case. Google Domains and Namecheap are cheaper still for domain name registration, with Namecheap sounding a bit less expensive (but we’re talking a few dollars a year here, tops). For free hosting, Github is another interesting option. The advantages include collaboration and automatic backup of any changes, a la Git. We use this for I3D, for example, with the site’s elements visible to all. For non-programmer-types there are plenty of other options.

Oh, and the presentation from 2012 I mentioned in my last post as no longer available – dead link – is available again, as Duncan Fewkes sent me a copy and Michal Valient gave me permission to host it. It’s now here – a few minutes’ work on my part.

Question for the day: if Gmail and Google Docs suddenly went away, would this cause a collapse that would take us back to the 1990s or the 1950s, or would the loss kick the world all the way back to some time in the 1800s? Just a thought: you might want to use Google Takeout or some other backup method now and then. If nothing else, visiting your Google Takeout site is interesting in that you see the mind-boggling number of databases Google has in your name.

Moving Targets, and Why They’re Bad

Executive summary: if you write anything or show off any images, you should make a real website, both for yourself and for others. The followup post gives some resources for doing so.

We’ve been updating the Real-Time Rendering site (take a peek – you might at least enjoy the 4th edition cover). Today I’ve been grinding through updating URLs for the references in the book. Even though the book’s not yet out, you can see what articles we reference and jump to each article from this page.

Most of the articles can be found by using Google or Google Scholar. A few articles are trickier to find, or have a few URLs that are relevant – that’s the value I feel I’m adding by doing this laborious task. Another reason is to help avoid link rot – I’ll explain that in a minute. Yet another is virus protection. For example, one blog URL, for the article “Render Color Spaces” by Anders Langlands, has had its domain anderslanglands.com (DON’T GO THERE (REALLY)) taken over by some evil entity in May 2018 and now leads to a page full of nastiness.

Going through our reference page today and adding links reminds me how tenuous our storage of knowledge is for some resources on the internet. Printed journals at least have a bunch of copies around the world, vs. one point of failure. I’ve noted this before. My point today is this: if you publish anything, go buy yourself a domain and host it somewhere (I like bluehost, as do Morgan McGuire and Andrew Glassner, but there are no doubt cheaper ways). All told, this will cost you maybe around $110 a year. Do it, if you care about sharing your work or are at all serious about your career (e.g., lose your job or want another? You now have a website holding your CV or work, ready to show). URLs have a permanence to them, vs. company-specific semi-hosting schemes such as Github or Dropbox, where the rules can and do change. For example, I just found a Github-based blog entry from Feb. 2017 that’s now gone (luckily still on archive.org). With some poking around, I found that the blog entry is in fact still on Github, but appeared to be gone because Github had changed its URL scheme and did not redirect from the old URL to the new one.

Once you have a hosted URL, look at how others arrange their resources, e.g., Morgan McGuire recently moved all his content from the Williams College website to his own personal site. Grab a free template, say from W3 Schools, or copy a site you like. Put any articles or presentations or images or whatever that you want people to find on that site. Me, I’m old school; I use basic HTML with a text editor and FileZilla for transfers, end of story. Start a WordPress or other blog, which is then hosted on your site and so won’t die off so easily.

Once you have a modest site up, you are done: your contributions to civilization are available to everyone until you forget to renew your domain or pay for web hosting. Assuming you remember, your content is available until you’re both dead and no one else keeps up with the payments (another good reason to renew for the longest duration).

Setting up your own website isn’t some ego-stroking thing on your part – some of the rest of us want continued access to the content you’ve provided, so please do keep it available. If your goal in writing is to help the graphics community, then allow your work to live as long as possible. “But my blog posts and whatnot have a short freshness ‘read by’ date,” you complain. Let us decide that; as someone who maintains the Graphics Gems repository, a collection of articles from 1990-1995, I know people are still using this code and the related articles, as they report bugs and errata to me. “I have tenure, and my school’s been around for 200 years.” So when you retire, are they going to keep your site going?

Most of us don’t grab a URL and host it, which is a pity for all concerned. Most of the links I fixed today rotted for one of three reasons: the site itself died (e.g., the company disappeared; I now can’t find this talk from 2012 anywhere, and at least 14 other sites link to it), the subdirectory on the site was deleted (e.g., for a student or faculty member no longer at the institution), or the URLs were reorganized and no redirection was put in place (and if you’re a webmaster, please don’t do this – take the time to put in some redirection, no matter how untidy it may feel to you). Some resources that still work are hanging on by a thread, e.g., three articles on our page are served up by FTP only. FTP! Update: see my follow-up post for where to find that 2012 talk now.

BTW, people have worked on how to have their sites outlive them, but so far I don’t know of a convincing system, one where the service itself is likely to outlast its participants. Some blog and presentation content does outlive its creator, or at least its original URL, as much of the internet gets archived by The Wayback Machine. So, for the virus-ridden anderslanglands.com site, the article I wanted to link to is available on archive.org. Jendrik Illner does something for his (wonderful) summary posts that I hadn’t seen before: each link also has a “wayback-archive” link for convenience, in case the link no longer works. You can also easily try such links yourself on any dead site by using this Chrome extension. With this extension active, a dead page will by default cause the extension to offer to look the page up on archive.org. Links have an average life of 9.3 years before they rot, and that’s just the average. You’re likely to live longer, so do your future older self a favor by saving them some time and distress: make a nice home for your resources now so you don’t have to later.
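
If you’d rather script these archive.org lookups than click an extension, the Wayback Machine has an availability API. Here’s a minimal sketch, Python standard library only (the dead URL at the end is just a placeholder):

    import json
    import urllib.parse
    import urllib.request

    def wayback_lookup(dead_url):
        # Ask archive.org for the closest archived snapshot of a URL.
        api = ("https://archive.org/wayback/available?url="
               + urllib.parse.quote(dead_url, safe=""))
        with urllib.request.urlopen(api) as resp:
            data = json.load(resp)
        closest = data.get("archived_snapshots", {}).get("closest")
        if closest and closest.get("available"):
            return closest["url"]  # the nearest archived copy
        return None

    print(wayback_lookup("http://example.com/some/dead/page.html"))  # placeholder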

If you’re too busy or poor to host your own content, at least paste your important URLs into archive.org’s site (you can also use the “Save Page Now” option in the Chrome extension, if you have a lot of pages) and your content will get archived (though if it’s a large PDF, maybe not). However, content on archive.org is not included in Google searches, so articles there effectively disappear unless the searcher happens to have the original URL and thinks to use the Wayback Machine. Also, people may stop looking when they try your original URL and find, for example, a porn site (e.g., this archive.org graphics site’s original URL goes to one now). This won’t happen if you have your own URL and maintain it.
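
If you have a pile of URLs to preserve, the “Save Page Now” endpoint can be scripted, too. A minimal sketch of how I’d imagine batch-saving them – treat it as a starting point, not a battle-tested tool (the list entries are placeholders):

    import time
    import urllib.request

    urls_to_save = [
        "https://example.com/my-article.html",  # placeholders; use your own URLs
        "https://example.com/my-slides.pdf",
    ]

    for url in urls_to_save:
        # Requesting https://web.archive.org/save/<url> asks the Wayback
        # Machine to archive that page.
        req = urllib.request.Request("https://web.archive.org/save/" + url,
                                     headers={"User-Agent": "link-preserver/0.1"})
        try:
            with urllib.request.urlopen(req, timeout=120) as resp:
                print("saved:", url, "->", resp.status)
        except Exception as err:
            print("failed:", url, err)
        time.sleep(10)  # be polite; space out requests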

For longer-term storage of your best ideas, don’t just blog about a topic, submit it to a journal (for example, JCGT takes practical articles) or article collection book (e.g., GPU Zen series, Ray Tracing Gems) and so have it become accessible for a good long while. It is possible and reasonable to take good blog content and rework it into an article. Going through peer review and copy editing will polish your idea all that much more.

These ramblings reflect my (limited) view of the world. If you know other approaches or resources to combat any aspect of link rot, please do let me know and I’ll fold them in here and credit you. Me, I hate seeing information get lost. Fight entropy now. Oh, and please put a date on any page you put up, so the rest of us can know if the page is recent or ancient history. Blog entries all have dates; everything else should, too.

Update: see my next post for some followups and a bunch of inexpensive options for making your own site.

Seven Things for June 27, 2018

The first two are the real reason for the post; the third is something I read today; the rest are bits from my Twitter feed, in case you don’t hang on my every word there.

  • Jendrik Illner summarizes graphics blog articles in his Graphics Programming weekly. Think of it as your one-stop blog for computer graphics. I wasn’t sure if he’d stick with it – it seems like a lot of work to me – but he’s nearing a year’s worth of issues.
  • The free, weekly Level Up Report by Mark DeLoura provides pointers to all sorts of developments and resources for learning through games, coding, and making. Subscribe!
  • Predatory Open Access journals – recent summary from The Economist, with some sad tales. Wikipedia notes some other sting operations, and also gives some counter-criticism.
  • Open source’s use in commercial products is on the rise, with a surprising average of 57% of the code in a proprietary application’s codebase being open.
  • Jamie Wong created a pleasant, profusely illustrated introduction to color science for computer graphics display.
  • I truly start with NVIDIA in August. With my time off, I’ve been occasionally finding time to have fun, with little projects in three.js such as this editable illusion and this local demoparty entry, and my chex_latex script now works on plain text files (yes, after too much time on The Book, I find copy editing fun, or at least a compulsion). Nothing astounding, keep your expectations low.
  • I don’t understand why people keep saying there has never been a mainstream game using a ray tracer. Here’s one from 1997 by Taito on the PlayStation:

Quick Tool to Check Your LaTeX

Executive summary: use the Perl script at https://github.com/erich666/chex_latex

I have been fiddling with this Perl script for a few editions of Real-Time Rendering. It’s handy enough now that I thought I’d put it up in a repository, since it might help others out. There are other LaTeX linters out there, but I’ve found them fussy to set up and use (“just download the babbleTeX distribution, use the GNU C compiler to make the files, be sure to use tippyShell for the command line, and define three paths…”). Frankly, I’ve never been able to get any of them to work – maybe I just haven’t found the right one, and please do point me at any (and make sure the links are not dead).

Anyway, this script runs over 300 tests on your .tex files, returning warnings. I’ve tried to keep it simple and not over-spew (if you would like more spew, use the “-ps” command line options to look for additional stylistic glitches). I haven’t tried to put in every rule under the sun. Most of the tests exist because we ran into the problem in the book. The script is also graphics-friendly, in that common misspellings such as “tesselate” are flagged. It finds awkward phrases and weak writing. For example, you’ll rarely find the word “very” in the new edition of our book, as I took Mark Twain’s advice to heart: “Substitute ‘damn’ every time you’re inclined to write ‘very.’ Your editor will delete it and the writing will be just as it should be.” So the word “very” gets flagged. You could also find a substitute (and that website is also in the comments in the Perl script itself, along with other explanations of the sometimes-terse warnings).

Maybe you love to use “very” – that’s fine; just comment out or delete that rule in the Perl script, it’s trivial to do so. Or put “% chex_latex” as a comment at the end of the line using it, so the warning is suppressed for that line. The script is just a text file, nothing to compile. Maybe you delete everything in the script but the one line that finds doubled words such as “the the” or “in in” or similar. In testing the script on five student theses kindly provided by John Owens, I was surprised by how many doubled words were found, along with a bunch of other true errors.
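
To give a sense of how simple such a rule can be: chex_latex itself is Perl, but here’s a minimal Python sketch of just the doubled-word check, catching repeats even across line breaks (pass your .tex files as arguments):

    import re
    import sys

    def find_doubled_words(path):
        text = open(path, encoding="utf-8").read()
        # \b(\w+)\s+\1\b: a word, whitespace (possibly a newline), the same word again
        for m in re.finditer(r"\b(\w+)\s+\1\b", text, flags=re.IGNORECASE):
            line = text.count("\n", 0, m.start()) + 1
            print(f"{path}:{line}: doubled word '{m.group(1)}'")

    for tex_file in sys.argv[1:]:
        find_doubled_words(tex_file)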

Oh, and even if you do not use this tool at all, consider at least tossing your titles through this website’s tester. It checks that all the words in a title are properly capitalized or lowercase.
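
For the curious, here’s a rough Python sketch of that kind of title check; the small-word list is my own guess at one common style, and real style guides differ on the details:

    SMALL_WORDS = {"a", "an", "and", "as", "at", "but", "by", "for", "from",
                   "in", "nor", "of", "on", "or", "so", "the", "to", "up", "yet"}

    def check_title(title):
        words = title.split()
        for i, word in enumerate(words):
            first_or_last = (i == 0 or i == len(words) - 1)
            if word.lower() in SMALL_WORDS and not first_or_last:
                if word != word.lower():
                    print(f"'{word}' should probably be lowercase")
            elif word[0].islower():
                print(f"'{word}' should probably be capitalized")

    check_title("Quick Tool To Check Your LaTeX")  # flags "To"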

A few minutes of additional work with various tools will make your presentation look more professional (and so, more trustworthy), so “just do it”. And, do you see the error in that previous sentence (hint: I wrote it in the U.S.)?

Update: I also added a little Perl script for “batch spell checking,” which for large documents is much more efficient (for me) than most interactive spell checkers. See the bottom of the repo page for details.
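
The batch idea, in a minimal Python sketch of my own (the actual script in the repo is Perl and differs in its details, and the dictionary path is an assumption for Unix-like systems): gather every unknown word across the whole document into one sorted list you can scan at a glance.

    import re
    import sys

    # Assumes a system word list; adjust the path for your machine.
    with open("/usr/share/dict/words", encoding="utf-8") as f:
        dictionary = {w.strip().lower() for w in f}

    # Collect every word not in the dictionary, across all files given.
    unknown = set()
    for path in sys.argv[1:]:
        text = open(path, encoding="utf-8").read()
        for word in re.findall(r"[A-Za-z]+", text):
            if word.lower() not in dictionary:
                unknown.add(word)

    for word in sorted(unknown, key=str.lower):
        print(word)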