Plugged the TarGZ loader together with my old OBJ and binary model loaders and emerged with a couple demos that load a model and a texture from a gzipped tarball and render it using WebGL.
Verdict: use uncompressed tarballs and HTTP gzip compression whenever possible. Doing the gunzip in JS doesn't cache the uncompressed data transparently (you'd have to do an offline storage hack), and it's kinda slow. Tar vs. plain files is a different tradeoff: with tar you only have to manage a single file, while with plain files you need to manage the whole directory hierarchy they live in, which may not be something you can or want to do with your file serving host.
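Managing "a single file" works because the tar format itself is simple to pick apart in JS. Here's a rough sketch of the idea, not the actual loader from the post: headers are fixed-width ASCII fields, and entries are padded to 512-byte blocks (real tars also have type flags, long-name extensions, etc., which this ignores).

```javascript
// Minimal sketch: read entries out of an uncompressed tarball
// (ArrayBuffer in, list of {name, data} out).
function parseTar(buffer) {
  const bytes = new Uint8Array(buffer);
  const files = [];
  let offset = 0;
  while (offset + 512 <= bytes.length) {
    // Header fields are fixed-width ASCII: name at offset 0 (100 bytes),
    // size at offset 124 (12 bytes, octal).
    const name = readString(bytes, offset, 100);
    if (name.length === 0) break; // zero-filled blocks mark end of archive
    const size = parseInt(readString(bytes, offset + 124, 12), 8);
    files.push({ name: name, data: bytes.subarray(offset + 512, offset + 512 + size) });
    // File contents are padded out to a 512-byte block boundary.
    offset += 512 + Math.ceil(size / 512) * 512;
  }
  return files;
}

function readString(bytes, offset, length) {
  let s = '';
  for (let i = offset; i < offset + length && bytes[i] !== 0; i++) {
    s += String.fromCharCode(bytes[i]);
  }
  return s;
}
```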
One solution is uncompressed tarballs with an .htaccess deflate filter set for them, though that seems to screw up caching somehow (thanks for the .htaccess tip go to Evgeny Demidov),
which also helps for JSON models, OBJs and other textual things. The main problem with the tarballs is that loading the images takes a bit longer (read: hundredths to tenths of a second) due to having to dataURLize them. If you pre-dataURL the images before packing, that won't be a problem.
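The dataURLizing step is just base64-encoding the raw bytes and gluing on a mime type so an img element (and from there a WebGL texture) can consume them. A sketch of the kind of helper involved; the name and the mime-type argument are made up for illustration:

```javascript
// Turn raw image bytes from the tarball into a data URL.
// The byte-by-byte string build plus btoa is the slow part
// for big textures.
function bytesToDataURL(bytes, mimeType) {
  let binary = '';
  for (let i = 0; i < bytes.length; i++) {
    binary += String.fromCharCode(bytes[i]);
  }
  return 'data:' + mimeType + ';base64,' + btoa(binary);
}
```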
The main benefit of the tarballs is that everything's in a single file and you don't need a chain loader to initialize models in a single step. Instead of first showing an untextured model, then sometime later the model with a diffuse texture, then later the model with a diffuse texture and a normal map, etc., with single-step initialization you get the full textured model once all the data is loaded. Plus, if you have a standard model format of some sort with manifests and whatnot, you can swap models by changing a single URL. You can do that with a plain model.xml or somesuch as well, but then you need to manage multiple files. Reusing resources packed inside tarballs is harder than reusing them as separate files, though.
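For comparison, single-step initialization with separate files means firing off all the requests up front and only building the model once every part has arrived. A sketch under invented names (loadOne is any url-to-Promise loader, createModel stands in for whatever the viewer actually does):

```javascript
// Kick off all loads in parallel; resolve only when every part is in.
function loadModelParts(loadOne, urls) {
  return Promise.all(urls.map(loadOne));
}

// Usage sketch:
// loadModelParts(fetchBytes, ['whale.obj', 'diffuse.png', 'normal.png'])
//   .then(function (parts) { createModel(parts[0], parts[1], parts[2]); });
```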
Also updated the big whale model viewer to be mouse-controllable (drag with LMB to rotate, drag with MMB to pan, scroll wheel to zoom, shift-scroll wheel to scale).
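The control mapping boils down to a small dispatch on the mouse button and modifier state; the camera operations and event wiring are hypothetical, only the mapping itself is shown:

```javascript
// LMB drag -> rotate, MMB drag -> pan (button codes 0 and 1),
// wheel -> zoom, shift+wheel -> scale.
function dragAction(button) {
  if (button === 0) return 'rotate';
  if (button === 1) return 'pan';
  return null;
}

function wheelAction(shiftKey) {
  return shiftKey ? 'scale' : 'zoom';
}
```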