Why submit when you can blog?

I was cleaning up the RTR portal page today. Of all the links on this page, I often use those linked in the first three items. I used to have about 30 blogs listed. Trying them all today, I found that 5 have disappeared forever (replaced by junk like this and this) and 10 more are essentially dead (no postings in more than a year). Understandable: blogs usually don’t live that long. One survey gives 126 days for the average lifetime of a typical blog. Another notes that even the top 100 blogs last an average of less than 3 years.

Seeing good blogs disappear forever is sad for me. If I’m desperate, I can try finding them using the Wayback Machine, but sometimes I’ll find only bits and pieces, if that. This goes for websites, too. If I see some article I like, I try to save a copy locally. Even then, such pages are hard to find later – I’m not that organized. Other people are entirely out of luck, of course.

My takeaway: feel free to start a blog, sure. But if you have some useful programming technique you’ve explained, and you want people to know about it for some time to come, then also submit it to a journal. One blog I mentioned last post, Morten Mikkelsen’s, shows one way to mix the two: he shows new results and experiments on his blog, and submits solid ideas to a journal. I of course strongly suggest the (new, yet old) Journal of Computer Graphics Techniques (JCGT), the spiritual successor to the journal of graphics tools (as noted earlier, all the editors have left the old journal). Papers on concise, practical techniques and ideas are what it’s for, just the sorts of things I see on many graphics blogs. Now that this journal is able to publish ideas quickly, I dearly want to see more short, Graphics Gems-like papers. If and when you decide to quit blogging/get hit by an asteroid/have a kid, and you have also submitted your work to a journal and had it accepted, you then have something permanent to show for it all, something that others can benefit from years later. It’s not that hard, honestly – just do it. JCGT prides itself on working with authors to help polish their work and bring out the best, but there are plenty of other venues, ranging from SIGGRAPH talks, Gamasutra articles, and GPU Pro submissions to full-blown ACM TOG papers.

Oh, I should also note that JCGT is fine with work that is not necessarily new, but fills a gap in the literature, explains an improved way of performing an algorithm, gives implementation advice, etc. Citing sources is important – don’t claim work that isn’t your own – but otherwise the goal is simple: present techniques useful for computer graphics programmers.

By the way, if you do run a website of any sort, here are my top three pet peeves; please don’t do these:

  • Moving page locations and leaving no forwarding page at the old page’s location (I’m looking at you, NVIDIA and AMD) – you don’t care if someone directs traffic to your site?
  • Giving no contact email address or other feedback mechanism on your web pages – you don’t want to know when something’s broken?
  • Giving no “last updated on” date on your web pages – you don’t want people to know how fresh the info is?

15 thoughts on “Why submit when you can blog?”

  1. ingenious

    Thanks, Eric, for the guidelines! I’m all for publishing in JCGT, mainly for two reasons – it’s peer-reviewed and it serves as an archive. Your work will never get lost, and it carries more weight.

    I’m personally seriously considering submitting a couple of things to JCGT, and I hope I’ll have time for this. Until then, I have two questions:

    1) Does JCGT replace JGT? Or, if both journals are planned to co-exist, what’s the difference between the two?

    2) Can you please add an RSS feed to the news/announcements section on jcgt.org?

    Thanks,
    Iliyan

  2. aboeing

    Interesting post, but you don’t seem to address the title of your post. Personally, I’ve come from the reverse situation. I used to write papers, but found that they have far less impact than a blog post (i.e., many more people read my blog than my papers). Second, writing a paper, with submission, review, etc., is far more work than putting together a blog post or a decent webpage. In addition, paper submission is usually accompanied by a submission cost, and a costly and time-consuming conference trip. Finally, I’m also no longer an academic, so there is no external incentive to publish.

    Paper publications are a thing of the past, blog posts and webpages are a better way to disseminate knowledge now, and for the foreseeable future.

    Please don’t interpret this as a negative comment on all your hard work on JCGT, I find it to be an extremely valuable resource.

  3. ingenious

    1) High content and writing quality, ensured by the peer-review process
    2) Ease of keeping track of and archiving interesting stuff

    These for me are the primary reasons to prefer journal papers over blog posts. The journal approach indeed has the drawbacks that aboeing pointed out. And I think the people behind JCGT realize that, and want to have something in the middle, which ideally takes the advantages of both approaches. This is why I think JCGT is a good idea, and why I support it.

  4. Eric Post author

    I admit to a bit of bait-and-switch with the title of my blog post; I was tempted to go with the converse, “Why blog when you can submit?” but the answers seemed obvious to me: with blogging you get immediacy, both in posting and feedback, along with total control. The downsides of blogging are what I wanted to point out: the short half-life of blogs, and the fact that they can just plain disappear. I was inspired to post this article mainly because I had been looking for some information that I knew was on the meshula.net blog, only to find that the blog had disappeared and the domain now goes to some garbage site.

    I see JCGT as a filter for readers, and a reality-check for authors. There are plenty of blogs to follow, the hard part is knowing what’s worth reading – see “Sturgeon’s Law”. Even good technical bloggers misstep, sometimes talking to themselves without realizing they’ve lost their audience. I know I’ve benefited from reviews with helpful feedback, e.g. “this part of the algorithm is unclear to me, especially step 4; perhaps a figure would help?” Maybe that’s how JCGT should be marketed: “You’ve put a nice new idea up on your blog. Do you want some feedback to improve both testing and presentation? Would you then like your result widely disseminated, archived, and made permanently available, if judged worthwhile by a neutral group of reviewers? Would you like this all done for free, at no cost to you, and you retain all rights to your work? Then send your work for review to JCGT.” Yes, it’s a bit more work to get the blog entry ready for submission, but our hope is that the benefits outweigh this cost. Not blogging at all is even less work, but I assume your motivation as a blogger is the same as mine, to get good information to interested people at no cost.

    On the question of whether JCGT replaces JGT, as far as the original JGT editorial board and I are concerned, JCGT replaces JGT. We asked the publishing house Taylor & Francis (who currently own JGT) for the JGT name and code archives; they declined, as (reasonably enough, from a profit standpoint) their goal as a publisher is to make money from their journals. So, confusingly, JGT lives on, though with none of the original editorial board. Taylor & Francis are currently looking for new editors in order to keep JGT alive.

  5. sebastien lagarde

    Hi,
    I am 200% with JCGT, but I wanted to share my thoughts about blogs.

    Blogs provide statistics, which is something I find really important.
    Blogs allow up-to-date information. It is annoying when a paper contains a mistake and you need to go somewhere on the web to find the correction (when you even know there is a correction), because the paper can’t be updated.

    My ideal way of publishing articles would be a Wikipedia for articles. I mean articles that can be updated even years later, that other authors could update with agreement, with history and statistics, with peer review.

    Just a random idea 🙂

  6. Larry Gritz

    What mechanisms are being put into place to ensure that JCGT (or any online journal; what does PLOS do?) doesn’t have all the shortcomings of blogs? What if it runs out of money, loses its volunteers, is sued by somebody who claims prior ownership of “JCGT.com”, or runs afoul of the deities in any number of other ways? At the end of the day, SOMEBODY is footing the bill for a server to host web pages at a particular address. If the server and/or the dollars go away, so do the papers.

    This was not a problem for print journals; the publishing company could fold, the editors could die, people could stop submitting, but all the prior issues would still be physically preserved in thousands of university libraries. An important paper would more or less always be available (at worst findable with some work, travel, or interlibrary loan), and in a medium that preserved its historically accurate form.

  7. Eric Post author

    Sebastien: statistics are an interesting point (and something an online journal could provide), though I’m not sure of their use beyond, “it’s nice to see a bunch of people have visited my page.” Since readers can’t easily compare such statistics across blog entries from different authors, there’s no way for them to find the important or popular offerings. Even when statistics are available, for a blog popular doesn’t mean useful: an advantage of a journal is that you know none of the articles will be vacation photos or pictures of the new baby.

    Corrections are definitely something I’ve thought about for journal articles. My feeling is that corrections should be put on the “homepage” for the article, the page which contains the abstract and links to related resources for the paper (source code, video, etc.). I’m of two minds when it comes to actually correcting the paper itself. I guess this would be fine overall, if the paper is clearly shown as “version 1.1” on the first page, and all corrections are explicitly listed at the paper’s end (so that it’s self-contained and a reader could see what was fixed). If a paper became popular, but during the course of its life had 5 different versions, it could be a bit confusing. It would certainly be important to note which revision you’re referencing. My main concern is avoiding overburdening the managing editor of the journal, who would have to reissue the PDF whenever errata are reported.

    Larry: yes, this is a definite concern, whether anything electronic just disappears if the people owning the site lose interest/go to jail/get divorced/etc. If you think about it, this is true for *all* internet resources, Wikipedia included. I’ve been thinking about that for my own little projects, and I suspect you’ve done the same: “what if a piano falls on my head today?” The Graphics Gems repository and the Ray Tracing News would probably live on, as they’re hosted at acm.org, but they would certainly never get updated. Naty and Tomas could cover resources hosted on realtimerendering.com, but that’s minimal support. I’m actually surprised there aren’t companies (that I’ve heard of) that deal with this sort of thing. It’s certainly a problem in general: http://www.economist.com/node/21553410. The laws in this area are a muddle, too, e.g. http://www.economist.com/node/21553011

    I guess my default answer is “The Wayback Machine”. It’s surprisingly thorough for some sites: Graphics Gems, the Ray Tracing News, and this site are all archived there, for example, including the downloadable code distribution (i.e., not just the pages). I personally don’t like the “only one backup” feel of this, though – what if the Wayback Machine goes away? (And how it hasn’t been sued out of existence yet by some idiot, I don’t know.) Perhaps the answer is to allow mirroring of the entire JCGT site by anyone who wants to do so – this is worth considering. What do you think? It’s also entirely possible to offer the journal as a printed entity, for example http://store.kagi.com/cgi-bin/store.cgi?storeID=6CZKD_LIVE&lang=en offers the open access journal JMLR as print copies. But, given tight library budgets, I don’t foresee libraries making such archival copies. Hmmm, maybe it’s good that Elsevier forces libraries to buy bundled subscriptions, so that the more obscure journals are available… The field of computer graphics has had to deal with this problem for decades, if you think about it. Some useful and important papers have been “published” in course notes, which are then very hard to find years later. I think of Jim Arvo’s “Backwards Ray Tracing” paper, which is luckily still at his website, http://www.ics.uci.edu/~arvo/papers.html (in postscript – that’s a different problem), even though he’s passed on.

    By the way, a problem with the old JGT is that, while the articles themselves are tucked away in libraries, the code and related resources (much of the reason for the journal) are not. Taylor & Francis, the current owner of JGT, restored the old publisher’s “homepages” for the articles, but have not given readers a way to actually reach these. This I consider a danger of having a publisher: if the imprint changes hands, the new owners may not maintain non-print resources (or even offer back issues). For JGT, there are two ways to get to the old pages with code. One entry point is here: http://jgt.akpeters.com/issues/, but this link may not last. The Wayback Machine also has them: http://web.archive.org/web/20110707101838/http://jgt.akpeters.com/

  8. sebastien lagarde

    On the point about the value of statistics: I find them motivating to continue writing, because not everyone gives feedback, even just to say whether or not they enjoyed reading an article. And knowing which of your topics are popular helps you learn which topics are of interest.

    On the article update point, I have nothing to add, just some real examples to show (which match what you say):

    – Take the example of the “LEAN Mapping” paper from Marc Olano; it is as you describe:
    http://www.csee.umbc.edu/~olano/papers/lean/
    Errors are reported on a homepage along with other links. That is OK as long as you check the homepage.

    – Take the example of the “Frequency Domain Normal Map Filtering” paper from Han et al.
    The web page reports no errata:
    http://graphics.berkeley.edu/papers/Han-FDN-2007-07/index.html
    However, Morten Mikkelsen found errors in this paper, contacted Han to confirm them, and suggested corrections in his thesis:
    http://image.diku.dk/projects/media/morten.mikkelsen.08.pdf
    Sadly, neither the original paper nor the website mentions the errors, or links to Morten Mikkelsen’s work.

    – Last case: a paper that gets updated in place, like “Stupid Spherical Harmonics Tricks” from Peter-Pike Sloan:
    http://www.ppsloan.org/publications/StupidSH36.pdf
    I like this way because the paper has a version number, the history can be found at the end of the paper, and only the latest corrected paper is available, so there is no way to miss the update.

    I agree that there is no all-in-one or clearly better solution. Re-editing a paper is difficult; reprinting a paper is impossible. This is why I like blogs.
    Maybe JCGT should have a homepage per article, with a list of update links and corrections?

    Just my two cents

  9. Eric Post author

    Oh, and I forgot to mention one mechanism that I personally think is worthwhile for archiving JCGT: the “collected works” book. As an example, JGT made an “Editors’ Choice” book, http://www.amazon.com/Graphics-Tools-The-Editors-Choice/dp/1568812469, a collection of articles we thought were particularly worthwhile. There are certainly options for print on demand, or making a book of one or two years’ worth of material, etc. It’s an interesting question whether we’d need to use a traditional publisher (just so we’d get an ISBN, etc.) or could roll our own print-on-demand version. A traditional publisher does have the advantage of marketing to libraries.

  10. Eric Post author

    As far as “will the journal’s contents disappear?”, Morgan McGuire, the editor-in-chief, replies:

    “The archival storage that I’m setting up is backed indefinitely by the endowment and a Mellon grant. According to a big sign outside of the library the college is a Library of Congress official archive (we have the original drafts of the declaration of independence, the US constitution, all congressional archives, etc.) and this new initiative is the digital extension of the physical storage. So I bet the Williams archiving plan is at least as good as the ACM’s… and the college has been successfully keeping records since 1753, so their track record is pretty good.”

  11. Eric Enderton

    According to the game industry panel at I3D 2012: game developers disseminate their technical ideas by blogs, Twitter, and beer. A blog post is quick, and it’s easy to include source code. Using Web3D it can even be live running source code. Review and selection happen by whoever tweets your blog post. I also heard that what game developers care about most when they see a technical article is, “How quickly can I get the code running, so I can kick the tires?”

    This is foreign to me. I would rather read JCGT and benefit from the work of editors. But maybe I’m just behind the times. Actually if each tweet was two sentences and a link, instead of just a link, I’d be sold: one sentence to summarize the idea, one sentence to give a review. (Like Halliwell’s Film Guide.)

  12. Steve Worley

    It’s important to preserve the information on (useful) blogs, but similarly, preserving useful forums is also important. I give the painful example of ompf.org, which hosted the definitive realtime raytracing community for years. It suddenly disappeared in November 2011. There are some partial archives, mostly from 2009, but much if not most of the great technical info and discussion is gone.
    What’s the solution to that problem? Part of it may be to simply not trust small hosts, but that’s not always possible.

    Perhaps one useful method that could help with (but not solve) the disappearance problem for blogs, forums, and journals would be for each to host a current archive of itself. For example, JCGT could pack all of its papers and code into a big (perhaps very big) archive file, downloadable at any time (by BitTorrent if size becomes an issue), and updated, say, quarterly. The loss of ompf.org would have been minor if any other site could restore a read-only mirror of all the ompf threads from such a periodic archive. Even a giant raw text dump of the ex-forum would still blunt most of the pain.

  13. Eric Post author

    Eric E.: there’s one Twitter feed I follow that mostly fits your requirements, https://twitter.com/#!/morgan3d, Morgan McGuire’s (by coincidence, the editor-in-chief of JCGT). His feed is purely cool links that have to do with graphics or games, and why they’re cool. I haven’t seen any others with this high a signal-to-noise ratio.

    Yes, ompf.org is a tragedy, so very sad – even the Wayback Machine’s last save was only January 2010. For mirroring, I had the same idea for JCGT: that there should just be a single link to a “download everything” RAR archive of the whole site (ZIP archives can be a maximum of “only” 4 GB in size, so RAR’s the way to go). The main danger is old mirrors getting mistaken for “the real thing”. I had this happen with the Ray Tracing News (which has a single “here’s the website” zip), with people saying “where’s the new issue?” when the issue was on the main site but not on the particular mirror.
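
    To make the periodic-dump idea concrete, here is a minimal sketch in Python (the paths and file names are made up; a real setup would need more care about permissions, incremental updates, and so on) of what building such a dated “download everything” archive could look like:

        import tarfile
        from datetime import date
        from pathlib import Path

        # Hypothetical locations; a real deployment would configure these.
        SITE_ROOT = Path("/var/www/jcgt")    # papers, code, supplemental files
        DUMP_DIR = SITE_ROOT / "archive"     # where the downloadable dumps live

        def build_site_dump():
            """Pack the whole site into one dated, compressed archive file."""
            DUMP_DIR.mkdir(parents=True, exist_ok=True)
            name = "jcgt-full-" + date.today().isoformat() + ".tar.gz"
            out = DUMP_DIR / name
            with tarfile.open(out, "w:gz") as tar:
                for path in SITE_ROOT.iterdir():
                    if path != DUMP_DIR:     # don't nest old dumps inside new ones
                        tar.add(path, arcname=path.name)
            return out

        if __name__ == "__main__":
            # Run this, say, quarterly (e.g., from cron), then publish the result.
            print("Wrote", build_site_dump())

    Anyone mirroring the site would then just grab the latest dump; restoring a read-only copy is a matter of unpacking it.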

  14. Pingback: In review – My 1st year (and a bit) of blogging | dickyjim
