Edited a video with After Effects.
Proceeded to follow a motion graphics tutorial.
And then I started making a website. Country life!
Recently: Writing a web app backed by a bunch of CGI programs written in Haskell, running on Mighttpd2, a web server written in Haskell. And a whole lot of JavaScript to make it run. It's using PostgreSQL to store its stuff.
What the app does: It's a website with a shop component. You can edit the pages by writing HTML into text fields. Yeah. Let's shake our collective heads together.
How's it going: I like writing Haskell, though I don't like the Cabal dependency hell. I'll probably have to use a different web server, as the Mighttpd2 version in Ubuntu 14.04 doesn't do HTTPS and I haven't had luck installing an HTTPS-supporting version from Cabal. Eh. JavaScript is, as usual, difficult to keep sane. CSS helps keep the JS less gnarly. I kinda like the little experiments in the page structure. First I had all the site content inlined into a single page, then I moved it out to HTML files that the JS pulls in (and inlined the front page using a build script). Then I moved the HTML files into a database, and now I'm fetching all of them in a JSON array at page load, so it's sort of going back to the inline-everything thing.
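The page-load fetch itself is nothing fancy, just an XHR for the pages array. Here's a minimal sketch of the idea; the /cgi-bin/ListPages path and the renderPage helper are placeholders, not the site's real names:

// Fetch the JSON array of pages at page load and render them.
// /cgi-bin/ListPages and renderPage are placeholder names.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/cgi-bin/ListPages');
xhr.onload = function() {
  var pages = JSON.parse(xhr.responseText); // JSON array of page objects
  pages.forEach(function(page) {
    renderPage(page); // turn each page object into DOM
  });
};
xhr.send();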
How about that shop: The shop part is building shopping carts for PayPal checkout. Which does work, though a completed payment should also ping the server to update the inventory.
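For the curious, a PayPal cart here just means a plain HTML form posted to PayPal. A rough sketch of the idea, not the site's actual code; the field names follow the classic Website Payments Standard cart upload, and the business address and item fields are placeholders:

// Build a cart-upload checkout form (classic PayPal Website Payments Standard).
// The business email, currency and item fields are placeholders.
function buildPayPalForm(cartItems) {
  var form = document.createElement('form');
  form.method = 'POST';
  form.action = 'https://www.paypal.com/cgi-bin/webscr';
  var fields = { cmd: '_cart', upload: '1', business: 'shop@example.com', currency_code: 'EUR' };
  cartItems.forEach(function(item, i) {
    fields['item_name_' + (i+1)] = item.name;
    fields['amount_' + (i+1)] = item.price;
    fields['quantity_' + (i+1)] = item.quantity;
  });
  for (var name in fields) {
    var input = document.createElement('input');
    input.type = 'hidden';
    input.name = name;
    input.value = fields[name];
    form.appendChild(input);
  }
  document.body.appendChild(form);
  return form; // call form.submit() to send the buyer to PayPal
}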
How it does what it does: The client does all the interesting bits. The server just serves JSON from the database. The admin client edits the JSON objects it receives from the server and sends them back to save them. The cool bit is that Aeson, a Haskell JSON library, typechecks the whole thing, decoding and encoding the JSON with a minimum amount of code on my part.
-- ListPages.hs
-- GET -> JSON
-- Returns a JSON array of pages.

import Network.CGI
import Data.Aeson
import PageDatabase
import SiteDatabase

instance ToJSON Page

main = runCGI $ handleErrors cgiMain

cgiMain = do
  setHeader "Content-Type" "application/json"
  setHeader "Access-Control-Allow-Origin" "*" -- set CORS header to allow calls from anywhere
  liftIO listJSON >>= outputFPS

listJSON = fmap encode $ withDB listPublishedPages
-- listJSON :: IO Lazy.ByteString
-- listPublishedPages :: Connection -> [Page]
--   fetches the list of pages from the database where published = true
-- Data.Aeson.encode turns [Page] into a ByteString of JSON.
-- See the "instance ToJSON Page" above?
-- That is all I need to do to get type-safe JSON encode and decode.
-- As long as I use "deriving (Generic)" in my type definition, that is.
Here's the Page data type from the PageDatabase module:
data Page = Page {
  page_id :: Int64,
  bullet :: String,
  body :: String,
  published :: Bool
} deriving (Show, Generic)
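With the derived Generic instances, the JSON on the wire simply mirrors the record field names. Roughly, a page looks like this as a JavaScript value (the values here are made up, not real site content):

// What a Page looks like on the wire: Aeson's generic encoding uses the
// Haskell record field names as JSON keys. Values are made-up examples.
var examplePage = {
  "page_id": 1,
  "bullet": "Front page",
  "body": "<h1>Hello from the country</h1>",
  "published": true
};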
This is how I deal with the JSON that the client sends:
-- EditPage.hs
-- POST (Page) -> JSON true | false
import Control.Monad
import Network.CGI
import SiteDatabase
import PageDatabase
import SessionDatabase
import GHC.Int
import Data.Aeson
instance FromJSON Page -- Make Data.Aeson.decode parse JSON into Page objects.
main = runCGI $ handleErrors cgiMain
cgiMain = do
  body <- getInputFPS "body"
  authToken <- getAuthToken -- helper that deals with session cookies, CSRF tokens and login user/pass params
  msg <- liftIO (maybe noPage (editPageDB authToken) (body >>= decode)) -- decode turns the body JSON into a Page object
  setHeader "Content-Type" "application/json"
  output msg

noPage = return "false" -- No body param received or the body param failed to typecheck in decode.

editPageDB authToken page = do
  rv <- withDB (\conn -> authenticate conn authToken (editPage conn page)) :: IO Int64 -- authenticate runs editPage if the authToken is OK
  case rv of
    1 -> return "true"  -- Edit successful
    _ -> return "false" -- Page not found or auth failed
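On the admin client side, saving a page is then just a matter of posting the edited object back as the body form parameter. Here's a minimal sketch, not the site's actual code; the /cgi-bin/EditPage path is a guess, and the session cookie / CSRF token handling that getAuthToken expects is left out:

// Sketch of the admin save call. EditPage.hs reads the JSON from a CGI input
// named "body", so the page object goes into a form-encoded parameter.
// The /cgi-bin/EditPage path is an assumption, not necessarily the real URL.
function savePage(page, callback) {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', '/cgi-bin/EditPage');
  xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
  xhr.onload = function() {
    callback(xhr.responseText === 'true'); // "true" on a successful edit, "false" otherwise
  };
  xhr.send('body=' + encodeURIComponent(JSON.stringify(page)));
}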
Isn't CGI kinda slow: Dunno. Testing on an EC2 micro instance, an HTTP request for a JSON array of all ten pages in the DB takes about 10 ms.
Couldn't you just use Weebly / Squarespace / Wix / Whatever: Hey! Watch it! No! Of course not!
Recently, I wanted to create a small video clip of a WebGL demo of mine. The route I ended up going down was to send the frames by XHR to a small web server that writes the frames to disk. Here's a quick article detailing how you can do the same.
Here's the video I made.
I used plain Node.js for my server. It's got CORS headers set, so you can send requests to it from any port or domain you want to use. Heck, you could even do distributed rendering with all the renderers sending finished frames to the server.
Here's the code for the server. It reads POST requests into a buffer and writes the buffer to a file. Simple stuff.
// Adapted from the WikiBooks OpenGL Programming: Video Capture article.
var port = 3999;
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
  res.writeHead(200, {
    'Access-Control-Allow-Origin': '*',
    'Access-Control-Allow-Headers': 'Content-Type, X-Requested-With'
  });
  if (req.method === 'OPTIONS') {
    // Handle OPTIONS requests to work with jQuery and other libs that cause preflighted CORS requests.
    res.end();
    return;
  }
  var idx = req.url.split('/').pop();
  var filename = ("0000" + idx).slice(-5) + ".png";
  var img = new Buffer('');
  req.on('data', function(chunk) {
    img = Buffer.concat([img, chunk]);
  });
  req.on('end', function() {
    fs.writeFileSync(filename, img);
    console.log('Wrote ' + filename);
    res.end();
  });
}).listen(port, '127.0.0.1');

console.log('Server running at http://127.0.0.1:' + port + '/');
To run the server, save it to server.js and run it with node server.js.
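If you want to check that the server works before touching any WebGL code, a throwaway request from the browser console will do; the frame index 0 and the three bytes sent here are arbitrary:

// Quick sanity check: POST a few bytes as frame 0 and see that the server
// writes a 00000.png file next to server.js.
var xhr = new XMLHttpRequest();
xhr.open('POST', 'http://localhost:3999/0');
xhr.onload = function() { console.log('Server responded with status ' + xhr.status); };
xhr.send(new Blob([new Uint8Array([1, 2, 3])]));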
There are a couple things you need to consider on the WebGL side. First, use a fixed resolution canvas, instead of one that resizes with the browser window. Second, make your timing values fixed as well, instead of using wall clock time. To do this, decide the frame rate for the video (probably 30 FPS) and the duration of the video in seconds. Now you know how much to advance the time with every frame (1/FPS) and how many frames to render (FPS*duration). Third, turn your frame loop into a render & send loop.
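For the fixed-size canvas, set the drawing buffer size directly and match the viewport to it. A small sketch; glc and gl here match the names used in the loop below, and 1024x768 is only an example resolution:

// Fix the drawing buffer size so every captured frame has the same dimensions.
var glc = document.querySelector('canvas');
glc.width = 1024;
glc.height = 768;
// preserveDrawingBuffer is optional: reading the canvas right after drawing
// in the same tick works without it, but it makes the readback less fragile.
var gl = glc.getContext('webgl', { preserveDrawingBuffer: true });
gl.viewport(0, 0, glc.width, glc.height);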
To send frames to the server, read PNG images from the WebGL canvas using toDataURL and ship them over with XMLHttpRequest. To get the frames across successfully, you need to convert them to binary Blobs and send the Blobs instead of the dataURLs. It's pretty simple, but can cost you an hour of banging your head against the wall (as experience attests). Worry not, I've got it all done in the snippet below, ready to use.
Here's the core of my render & send loop:
var fps = 30; // Frames per second.
var duration = 1; // Video duration in seconds.
// Set the size of your canvas to a fixed value so that all the frames are the same size.
// resize(1024, 768);
var t = 0; // Time in ms.
for (var captureFrame = 0; captureFrame < fps*duration; captureFrame++) {
  // Advance time.
  t += 1000 / fps;

  // Set up your WebGL frame and draw it.
  uniform1f(gl, program, 'time', t/1000);
  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
  gl.drawArrays(gl.TRIANGLES, 0, 6);

  // Send a synchronous request to the server (sync to make this simpler).
  var r = new XMLHttpRequest();
  r.open('POST', 'http://localhost:3999/' + captureFrame, false);
  var blob = dataURItoBlob(glc.toDataURL());
  r.send(blob);
}
// Utility function to convert dataURIs to Blobs.
// Thanks go to SO: http://stackoverflow.com/a/15754051
function dataURItoBlob(dataURI) {
  var mimetype = dataURI.split(",")[0].split(':')[1].split(';')[0];
  var byteString = atob(dataURI.split(',')[1]);
  var u8a = new Uint8Array(byteString.length);
  for (var i = 0; i < byteString.length; i++) {
    u8a[i] = byteString.charCodeAt(i);
  }
  return new Blob([u8a.buffer], { type: mimetype });
}
And there we go! All is dandy and rendering frames out to a server is a cinch. Now you're ready to start producing your very own WebGL-powered videos! To turn the frame sequences into video files, either use the sequence as a source in Adobe Media Encoder or use a command-line tool like ffmpeg or avconv: avconv -r 30 -i %05d.png -y output.webm.
To sum up, you now have a simple solution for capturing video from a WebGL app. In our solution, the browser renders out frames and sends them over to a server that writes the frames to disk. Finally, you use a media encoder application to turn the frame sequence into a video file. Easy as pie! Thanks for reading, hope you have a great time making your very own WebGL videos!
After posting this on Twitter, I got a bunch of links to other great libs / approaches to do WebGL / Canvas frame capture. Here's a quick recap of them: RenderFlies by @BlurSpline - it's a similar approach to the one above but also calls ffmpeg from inside the server to do video encoding. Then there's this snippet by Steven Wittens - instead of requiring a server for writing out the frames, it uses the FileSystem API and writes them to disk straight from the browser.
And finally, CCapture.js by Jaume Sánchez Elias. CCapture.js hooks up to requestAnimationFrame, Date.now and setTimeout to make fixed timing signals for a WebGL / Canvas animation then captures the frames from the canvas and gives you a nice WebM video file when you're done. And it all happens in the browser, which is awesome! No need to fiddle around with a server.
Thanks for the links and keep 'em coming!