art with code

2011-04-24

Runfield & Remixed Reality

Oh right, I did these demos for Mozilla a couple months back:


Runfield


Runfield is a Canabalt clone with painted graphics (I made a guide on how to do your own graphics for it, but it's a bit "First you sketch a good-looking picture and then you paint it! Done!"). The graphics were painted in MyPaint and GIMP. The renderer is done with Canvas 2D and uses drawImage to draw thin vertical slices from the background images to make up the undulating ground.

The main things I wanted to communicate with Runfield were speed and polish. Showing that you can make a fast 2D game with JS and have it look good. Accordingly, most of the dev time was spent making the graphics and optimizing the engine (:

Optimization tips: draw images aligned to the pixel grid, eliminate overdraw (if you know that a part of an image is not going to show, don't draw that part), use the first couple seconds to detect the framerate and drop down to a lighter version if the framerate is low.
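Two of those tips can be sketched in a few lines. This is just the shape of the idea, not Runfield's actual code; `snap` and `qualityForFps` are made-up names, and the FPS thresholds are arbitrary:

```javascript
// Snap draw coordinates to whole pixels so drawImage doesn't have to
// resample the image across pixel boundaries.
function snap(x) {
  return Math.round(x);
}

// Pick a quality tier from a framerate measured over the first couple
// of seconds. The thresholds here are arbitrary examples.
function qualityForFps(fps) {
  if (fps >= 50) return 'high';
  if (fps >= 25) return 'medium';
  return 'low';
}

// In the render loop you'd then do something like:
//   ctx.drawImage(ground, snap(scrollX), snap(groundY));
```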


Remixing Reality


Remixing Reality is another demo to showcase what you can do with JavaScript today. It's processing video frames in real-time to locate AR markers and uses WebGL to draw 3D models on top of the markers. If you click the play button on the side, music starts playing and there's a 3D music visualizer powered by BeatDetektor2 and the Mozilla audio API, again analyzing the audio in real-time.

The AR library powering the thing is JSARToolKit, a pure-JS port of the Flash FLARToolKit (which uses NyARToolKitAS3, which is a port of the Java NyARToolKit, which is a port of the C ARToolKit. Whew.) Porting it over to JS was pretty quick, since the AS3 syntax is close enough to JS syntax that I could write a good-enough syntax translation script in a couple days. Then I implemented the AS3 class semantics in JS and off we go.

Well, it wasn't quite that easy. The syntax translator is a hack and I had to go and manually fix things. And implement the pertinent parts of Flash's BitmapData. And write a shim to make it work with Canvas. But hey, 14 kloc port in a week!

The job didn't end there though. It was slow. The biggest slowdown was that the library was reading a couple pixels at a time from the canvas, and each of those reads called getImageData. So, cache it, problem solved.
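The caching idea looks roughly like this: read the whole frame once, then index into the cached data array instead of calling getImageData per pixel. `pixelAt` is a made-up helper name, not from the actual port:

```javascript
// Instead of calling ctx.getImageData(x, y, 1, 1) for every pixel read,
// grab the whole frame once and index into its data array.
// ImageData stores pixels as consecutive RGBA bytes, row by row.
function pixelAt(imageData, x, y) {
  var i = (y * imageData.width + x) * 4;
  var d = imageData.data;
  return [d[i], d[i + 1], d[i + 2], d[i + 3]];
}

// Once per frame:
//   var frame = ctx.getImageData(0, 0, canvas.width, canvas.height);
//   ...then call pixelAt(frame, x, y) as many times as you like.
```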

It was still a bit slow, mostly because FLARToolKit uses BitmapData's color bounding-box queries to do feature detection: find the smallest rectangle in the bitmap that contains all pixels of a given color. Each call to BitmapData#getColorBoundsRect has to walk the pixels in the bitmap to find the first row where the wanted color appears, then the bottom row, then scan the rows in between for the left-most and right-most columns. That process was not very fast in JS.
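To see why that's expensive, here's a naive JS version of such a bounding-box scan. This is just the shape of the work, not the actual FLARToolKit or port code, and it operates on a simplified one-value-per-pixel array:

```javascript
// Find the smallest rectangle containing all pixels of a given color
// in a one-value-per-pixel array (e.g. a thresholded image).
// Every query walks essentially the whole image.
function colorBounds(pixels, width, height, color) {
  var top = -1, bottom = -1, left = width, right = -1;
  for (var y = 0; y < height; y++) {
    for (var x = 0; x < width; x++) {
      if (pixels[y * width + x] === color) {
        if (top === -1) top = y;
        bottom = y;
        if (x < left) left = x;
        if (x > right) right = x;
      }
    }
  }
  if (top === -1) return null; // color not found
  return {x: left, y: top, width: right - left + 1, height: bottom - top + 1};
}
```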

But NyARToolKit, the library which FLARToolKit is based on, was doing the feature detection in an entirely different way. Its algo was running on RLE-compressed images (Run-Length Encoding: pack data as [value][number of repetitions], e.g. aaabbbb becomes a3b4). And since the images in question are thresholded to black and white, RLE works very well. Expected result: smaller images => less work for JS => faster.
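Run-length encoding a thresholded row is a few lines of JS. A minimal sketch (not the NyARToolKit implementation) that produces [value, runLength] pairs:

```javascript
// Run-length encode a row of pixel values into [value, runLength] pairs.
// For thresholded black-and-white images the runs are long, so the
// encoded form is much smaller than the raw row.
function rleRow(row) {
  var runs = [];
  for (var i = 0; i < row.length; i++) {
    var last = runs[runs.length - 1];
    if (last && last[0] === row[i]) {
      last[1]++;
    } else {
      runs.push([row[i], 1]);
    }
  }
  return runs;
}
// rleRow([0,0,0,1,1,1,1]) gives [[0,3],[1,4]] — the "a3b4" idea above.
```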

So... I made the JS port use the NyARToolKit approach instead. And hey, it was 5x faster! Nice!

Another thing that helped performance on Firefox 4 was using typed arrays instead of normal JS arrays. Fx4's JIT generates more efficient machine code for typed arrays. On Chrome 10(? IIRC), typed arrays and normal arrays didn't have much of a performance difference, but the code ran fast enough on normal arrays already.
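For a feel of what typed arrays are: they hold raw numeric data in a fixed-size buffer, which is what lets the JIT generate tighter code for them. Uint8ClampedArray is the type canvas ImageData uses; its clamping behavior differs from a plain Uint8Array:

```javascript
// Typed arrays store raw numbers in a fixed-size, fixed-type buffer.
// Uint8ClampedArray (used by canvas ImageData) clamps writes to 0..255:
var pixels = new Uint8ClampedArray(4);
pixels[0] = 300;  // clamped to 255
pixels[1] = -5;   // clamped to 0

// A plain Uint8Array wraps modulo 256 instead:
var plain = new Uint8Array(4);
plain[0] = 300;   // becomes 44
```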

For the 3D stuff I used my Magi library, with a Blender export script to get the models in and a slightly tweaked lighting shader to fill the unlit areas with some ambient. Fun times.

2011-04-01

Browser rendering loop

The browser rendering loop is the process the browser goes through to get a web page onto your screen.

The main stages of the rendering loop are:
  1. Updating the DOM.
  2. Rendering the individual elements.
  3. Compositing the rendered elements to the browser window.
  4. Displaying the browser window to the user.

The DOM updates happen in JavaScript or in CSS transitions and animations. They include things like "Hey, I'd like that header to turn red." and "Please draw a thick line on the canvas."

If you're drawing to a canvas, you might expect the browser to draw as soon as you issue a drawing command. That's what actually happened in earlier browser versions, but the latest browsers don't quite work that way. Nowadays the browser queues up the drawing commands and only starts drawing when it absolutely needs to, which is usually just before compositing, in the second stage of the rendering loop.

However, if you want to force the browser to finish drawing before continuing JS execution, you can try calling getImageData on a 2D canvas or readPixels in WebGL. As they need to return the finished image, they should force the browser to flush its draw queue. This comes in handy if you ever need to figure out the time it took for the browser to execute your drawing commands.
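A sketch of that timing trick. `flush2d` and `timeDraw` are made-up names; the assumption is that a 1x1 getImageData is enough to make the browser finish pending 2D drawing:

```javascript
// Force the browser to finish pending 2D drawing by reading a pixel back.
// Assumption: getImageData flushes the draw queue, as described above.
function flush2d(ctx) {
  ctx.getImageData(0, 0, 1, 1);
}

// Time a batch of draw calls, including the forced flush.
function timeDraw(drawFn, flushFn) {
  var t0 = Date.now();
  drawFn();
  flushFn();
  return Date.now() - t0; // elapsed milliseconds
}

// Usage in a page:
//   var ms = timeDraw(function () { drawScene(ctx); },
//                     function () { flush2d(ctx); });
```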

Once all the individual elements are drawn, the browser composites them together to create the final browser window image. And finally, the browser window image is shown to the user via the OS window manager.

The frame rate perceived by the user is the frequency at which step 4 is repeated. In other words, how often the updated browser window is shown to the user.

As most flat panel displays can only update 60 times per second (60 Hz), the browser tries to display at most 60 frames per second. Going over 60 when your display can't take advantage of it would only burn more CPU and reduce battery life, so it makes sense to clamp the update frequency to the display's update frequency.

Ideally, the browser would finish all its drawing before doing a new composite, but current browser implementations have slight problems with that. So it's really rather difficult to figure out the actual framerate visible to the user. If you have a high-speed video camera, you could record the display and see how fast it's updating.

But if you want to do it all in the browser, you could try something like this. First, hook up to the frame loop with requestAnimationFrame. Second, flush the drawing queue when your frame is done. Third, measure time from flush to flush. Hopefully browsers will move towards requestAnimationFrame only being called after flushing the previous frame.
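The measurement loop could look roughly like this. The averaging helper is the only real logic; the page-side wiring is sketched in the comments, and `drawScene` is a placeholder for your own drawing code:

```javascript
// Average frames-per-second from a list of flush-to-flush frame
// timestamps (in milliseconds).
function fpsFromTimestamps(ts) {
  if (ts.length < 2) return 0;
  var elapsed = ts[ts.length - 1] - ts[0];
  return (ts.length - 1) * 1000 / elapsed;
}

// In a page you'd collect the timestamps roughly like this:
//   var stamps = [];
//   function frame() {
//     drawScene(ctx);                // your drawing code
//     ctx.getImageData(0, 0, 1, 1); // force a flush
//     stamps.push(Date.now());
//     requestAnimationFrame(frame);
//   }
//   requestAnimationFrame(frame);
```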

Firefox and Chrome actually make this a bit easier for you by providing some built-in framerate instrumentation. Chrome dev channel has an about:flags FPS counter that (sadly) only works for accelerated content. Firefox 4 has the window.mozPaintCount property that keeps track of how many times the browser window has been redrawn.

References:
GPU Accelerated Compositing in Chrome
Hardware Acceleration in the latest Firefox 4 beta
ROC: Measuring FPS
Measuring HTML5 Browser FPS, or, You're Not Measuring What You Think You're Measuring

Please send me a note if anything above is wrong / misguided / an affront to your values and I'll try and fix it.
