art with code


Raspberry Pi RC car

Here's the software package to turn your Raspberry Pi into an RC car: https://github.com/kig/rpi-car-control


Use a Raspberry Pi to drive an RC car from a web page.

How does it work?

Open up a cheap RC toy car. Connect the motors to a Raspberry Pi. Add a camera. Run a web server on the Raspberry Pi that controls the car.

In more detail, you need to replace the car's PCB with a motor controller board (say, a tiny cheap MX1508 module). Then solder the motors and the car's battery pack to the motor controller. Solder M-F jumper cables to the motor controller's control connectors. Plug the other ends of the jumpers into the Raspberry Pi GPIOs. Now you can control the motors from the Raspberry Pi.
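Once the jumpers are on the GPIO pins, driving one motor channel is a few lines of Python. A minimal sketch with RPi.GPIO, using the forward/backward pins from the pin table later in this post; the helper names are mine, not the repo's control/car.py:

```python
# Minimal sketch of driving one MX1508 motor channel with RPi.GPIO.
# Pins match the pin table later in this post (BCM 17 forward, 27 backward).
FORWARD_PIN = 17   # MX1508 IN1
BACKWARD_PIN = 27  # MX1508 IN2
PWM_HZ = 100

def duty_cycles(speed):
    """Map a speed in [-1, 1] to (forward, backward) PWM duty cycles in percent."""
    speed = max(-1.0, min(1.0, speed))
    if speed >= 0:
        return (speed * 100.0, 0.0)
    return (0.0, -speed * 100.0)

def drive(speed):
    """Apply a speed to the motor. Only runs on the Pi itself."""
    import RPi.GPIO as GPIO
    GPIO.setmode(GPIO.BCM)
    GPIO.setup([FORWARD_PIN, BACKWARD_PIN], GPIO.OUT)
    forward = GPIO.PWM(FORWARD_PIN, PWM_HZ)
    backward = GPIO.PWM(BACKWARD_PIN, PWM_HZ)
    f, b = duty_cycles(speed)
    forward.start(f)
    backward.start(b)
```

Steering works the same way on its own pin pair; the MX1508 just needs one PWM-capable input per direction.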

Expose the Raspberry Pi camera as an MJPEG stream so that you can view it directly as an IMG in the browser. This is the easiest low-latency, low-CPU, high-quality streaming format.
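For reference, MJPEG over HTTP is just a multipart/x-mixed-replace response that keeps pushing JPEG frames, which the browser's IMG tag renders natively. A hedged Python sketch; frame_source() is a hypothetical stand-in for whatever yields JPEG frames, and the repo's real streaming setup lives in video/:

```python
# A minimal MJPEG-over-HTTP sketch. frame_source() is a hypothetical
# generator yielding JPEG-encoded frames (e.g. from the camera).
import http.server

BOUNDARY = b"frame"

def mjpeg_part(jpeg):
    """Wrap one JPEG frame as a multipart/x-mixed-replace part."""
    return (b"--" + BOUNDARY + b"\r\n"
            + b"Content-Type: image/jpeg\r\n"
            + b"Content-Length: " + str(len(jpeg)).encode() + b"\r\n\r\n"
            + jpeg + b"\r\n")

class MJPEGHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type",
                         "multipart/x-mixed-replace; boundary=frame")
        self.end_headers()
        for jpeg in frame_source():  # hypothetical: yields JPEG bytes
            self.wfile.write(mjpeg_part(jpeg))

# http.server.HTTPServer(("", 8080), MJPEGHandler).serve_forever()
```

With something like this running, a plain IMG pointed at the stream URL shows live video with no decoder JavaScript at all.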

If the car has lights, you can drive them from the GPIOs as well (either directly or via a proper LED controller). Add a bunch of sensors to the car for the heck of it. I've got a tiny VL53L1X ToF laser-ranging sensor as a reversing radar, and a DHT temperature and humidity sensor. There's code in the repo to hook up an ultrasonic rangefinder too (it can even use the DHT sensor to calculate the speed of sound for the current temperature and humidity - and it has a Kalman filter of sorts, so you can reach ~mm accuracy), and some bits and bobs for using a PIR sensor.
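The humidity-compensated ultrasonic math is simple enough to sketch. The formula below is a standard engineering approximation for the speed of sound in air; the function names are mine, not the repo's sensors code:

```python
# Speed of sound from the DHT readings, and distance from the echo time.
def speed_of_sound(temp_c, humidity_pct):
    """Approximate speed of sound in air, in m/s."""
    return 331.4 + 0.606 * temp_c + 0.0124 * humidity_pct

def echo_distance(round_trip_s, temp_c=20.0, humidity_pct=50.0):
    """Distance in meters: sound travels the path twice, out and back."""
    return speed_of_sound(temp_c, humidity_pct) * round_trip_s / 2.0
```

At 20 °C and 50% RH a 2 ms echo works out to about 34 cm, and the temperature term alone moves the result by roughly 0.2% per degree, which is where the compensation pays off.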

There was also a microphone input, with playback either through wired speakers or a Bluetooth speaker, but that's not enabled at the moment. There was a WebRTC-based streaming solution for two-way video calls too, but that was such a pain I gave up on it. I was using RWS, which is pretty easy to set up, but the STUN/TURN stuff was tough.

Add a USB battery pack to power the Raspberry Pi and you're about done. If you're feeling adventurous, you could use a 5V step-up/step-down regulator to run the Raspberry Pi directly from the car batteries.


raspi-config # Enable I2C to use the VL53L1X sensor
sh install.sh

The install script installs the car service and its dependencies. This is best done on a fresh install of Raspbian. The install script overwrites NGINX's default site configuration.

After starting the car control app with sudo systemctl start car, you can connect to http://raspberrypi/car/ and play with the controls web page.

The car control app is installed in /opt/rpi-car-control.

To use a SSH tunnel server, edit /etc/rpi-car-control/env.sh and change the line RPROXY_SERVER= to RPROXY_SERVER=my.server.

With the SSH tunnel, you can access the car at http://my.server:9999/car/. It's best to firewall this port and add an HTTPS reverse proxy that points to it. Look at etc/remote_nginx.conf for a snippet that sets up an authenticated NGINX reverse proxy on the remote server. (Run htpasswd -c /etc/nginx/car_htpasswd my_username to create the password file.)
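For the curious, the tunnel boils down to an SSH reverse port forward. Roughly what setting RPROXY_SERVER enables (a sketch; the repo's actual tunnel script may use different options):

```shell
# Expose the Pi's local NGINX (port 80) as port 9999 on my.server.
# -N: no remote command, just the tunnel.
ssh -N -R 9999:localhost:80 my.server
```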


See /etc/rpi-car-control/env.sh for settings.

# SSH tunnel reverse proxy
# One of v4l2-mjpeg, v4l2-raw, raspivid
# Which camera to use in the v4l2 modes
# Video settings



Controls HUD

The circle on the left is the accelerator indicator, and the circle on the right is the steering indicator. The bar in the bottom middle is the reversing distance indicator. The sensor data readout is at the top left. The little square at the bottom right toggles full-screen mode.

The controls are defined near the bottom of html/main.js.

Touch controls

  • Use left thumb to accelerate and reverse, right thumb to steer.

Keyboard controls

  • Use arrow keys to drive.
  • The number keys 1-4 control the front light intensity and 0 turns the rear lights on and off.
  • The z key blinks the left front light, the c key blinks the right front light, and the x key turns off the blinkers.


The app is very modular, so you can run it without an actual car or camera and just play with a web page whose controls do nothing.

If you wire up the motors, you should be able to drive. If you wire up the lights, they should light up.

Wire up the sensors and you should start seeing sensor data in the HUD.

Add a camera and you'll see a live video stream.


See control/car.py and sensors/sensors_websocket.py for the pin definitions. The VCC and GND connections have been left out. Just remember to use the correct voltage when wiring those.

Function            GPIO  Notes
Motor forward (A)     17
Motor backward (B)    27
Steering left (A)     24
Steering right (B)    23
Left headlight         5  The headlights turn on when you connect
Right headlight        6  They can also blink a turning signal
Rear lights           13  Rear lights light up when you reverse
Power PWM             12  Disabled, for use with L298N
DHT11 signal          14
PIR signal            22
VL53L1X power          4  Use a GPIO and you can turn it off when not in use
VL53L1X SDA            2  I2C bus 1
VL53L1X SCL            3  I2C bus 1
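The same map as a Python dict (BCM numbering), with a quick sanity check against double-assigned pins. The key names are mine, not identifiers from control/car.py:

```python
# Pin assignments from the table above, BCM numbering.
PINS = {
    "motor_forward": 17, "motor_backward": 27,
    "steer_left": 24, "steer_right": 23,
    "headlight_left": 5, "headlight_right": 6,
    "rear_lights": 13, "power_pwm": 12,
    "dht11": 14, "pir": 22,
    "vl53l1x_power": 4, "vl53l1x_sda": 2, "vl53l1x_scl": 3,
}

# No GPIO should be assigned twice.
assert len(set(PINS.values())) == len(PINS), "duplicate GPIO assignment"
```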


  • FPV stream web page with keyboard & touch controls to drive the car, along with a reversing distance indicator and a thermometer.
  • Low-latency video stream for driving (down to 50 ms glass-to-glass with a 90 Hz camera and a 240 Hz display).
  • A bunch of WebSocket servers to send out sensor data and receive car controls.
  • NGINX reverse proxy config to tie all the servers together.
  • Systemd service to start the car control server on boot.
  • SSH tunnel to a remote control server to drive the car from anywhere.
  • Low-power tweaks to increase battery life (disables HDMI, Ethernet and USB).
  • Use the RaspiCam or a V4L2 USB webcam, either with raw video (eats CPU) or camera-supplied MJPEG.


  • Bluetooth speaker pairing for playing audio.
  • Stream car microphone to the browser.
  • Speak to the car from the browser by sending audio with Web Audio API.
  • WebRTC call between browser and car.

In progress

  • PoseNet with Coral USB accelerator for "point and I'll drive there"


  • OMX JPEG encoder for raw video cameras
  • SLAM and "click on a map position to drive there"
  • Good small microphone + speaker solution
  • Small display to do two-way video calls
  • Non-sucky camera mount (duct tape doesn't really work)
  • Power car and computer from one battery
  • Automatic wireless charging when battery is low
  • Shutdown when battery critical
  • Speech controls


Take a look at run.sh first. It starts the web server and optionally the reverse proxy tunnel. The web server is in web/web_server.py and starts up bin/start_control_server.sh and bin/start_server.sh when needed. The sensors are controlled by sensors/sensors_websocket.py, and the car controls are in control/car_websockets.py. For video streaming, have a look at video/start_stream.sh. The HUD is in html/, see html/main.js for the car controls and how the video and sensor data are streamed.



Ilmari Heikkinen © 2020



Found this short story of mine from 2003.


Bright sunlight shone down through a sieve of elm leaves. The air was rich with the joyous melodies of songbirds. Gradually intensifying. Until the racket was nearly deafening.

I woke up. The alarm I had set last night wasn't quite as gentle as I had hoped, but it did get its job done. After getting my head straight with a sip of very hot, very strong coffee, I switched the overlay mode to read the few bits of mail that got through my extensive spam filter. Nothing much, just a few more entries into the filter. Some interesting linguistic randomizer used to compose two pieces of junk mail, completely different in detail, but identical in content. Spent the next five minutes training the language analyzer to filter out such messages. Thinking about implementing an opt-in filter to keep out all unsolicited mail. Coffee finished, starting on a croissant with two slices of cheese and a salad leaf. Reading the newsbits on overlay, not much going on locally. Not much going on globally. Twenty-seven new posts on the community forums overnight. Most of it ranting, logging and post-flood. No more croissant.

Teeth clean. Beard shaved. Donning my well-worn semi long jacket with a big decorative camtex pattern of a round microcircuit flower on the back. Setting the default texture of the flower to quicksilver today. Playing Strauss on the audio overlay, walking the stairs down to street level in beat. ID-system sending auth to the door. Out of the door and onto the street. Toning down brightness and contrast of the urban landscape, selecting an 18th century Italian architecture style for the building camtex overlay. Walking to the grocery store, lazily scanning the shared data of passersby, grabbing a spectator video of yesterday's CTF semifinals. Checking out some interesting plays, when I hear the audio overlay collision alert. Sidestepping to avoid bumping into a man dressed in a synthleather jacket with a blazing camtex phoenix pattern. Fiddling around with overlay texture controls, turning people into 1x1x2m blocks of sandstone with dust rising in their wake and public ID carved onto all sides in a blocky rune font. A sandstone column with a green target marker suddenly appears from around a corner. Turning off the sandstone texture of people with markers.

Nostro is wearing her blueish-gray long wool coat. No camtex in sight. We talk and rummage through each other's shared files while on the way to the grocery shop. I show her my sandstone block mod and she giggles at me being a horrible person. Stepping through the shop doors, greeting the shopkeeper, picking up some oranges and pizza, heading back out through the door. Approving the instant debit payment dialog appearing on the overlay. Nostro is sipping from the apple juice carton she bought. We part ways. I head to the park. She goes home.

Chatting on the 'net, dodging lollerskaters, blasting old-skool Prodigy on the overlay. The park has a little stream running through it, roughly ten meters from bank to bank. Some ducks and an occasional swan swimming in it, fed fat by people throwing bread crumbs at them. I sat on a park bench in the shade of a cherry tree. Switched overlay to work mode, started going through work mail, attached the work network screen and checked my todo-list. Changed text input method from vocal chord sensors to finger joint virtual keyboard. Started with editing out done jobs from the todo-list, proceeded with coding some overlay mods. Mostly visual fx, with some audioscape hints and ideas for the sound guys. Spent the afternoon debugging a furry cherry tree mod. Detached the work screen and threw the now empty bag of oranges into a recycler. Took the pizza home, put it in the freezer, hoping it didn't go bad from working with me. Attached the work screen to put some final touches into the cherry tree. Added a 20% encounter rate blackbird into the scene. Set myself down onto the bed.

Cruelly awakened an hour later by the familiar jingle of a game request. Switched overlay to complete external sensor block and dove into the mechanical battlearmor. Thirty minutes and several deaths later, the opposing team succeeded in destroying our last remaining respawn point. Soon after, two enemy armors emerged from behind a hill and blew my right leg joint with a well-aimed AP round. I managed to take out the other enemy armor with some creative mortar juggling, but the other managed to hit my immobile armor with some plasma rounds near the vulnerable ammo deposits. A big boom later I was but another spectator following the desperate struggle of our last remaining armor against a full enemy five armor squad. Five seconds, and the only remainder of our last armor was a big charred hole in the ground. I switched external sensors back on and talked about the game with the guy on my contact list who woke me up.

I put the pizza into the oven, set the overlay to alert me in 12 minutes. Checked the videofeeds. Nothing really interesting going on on commercial live webcasts. The girl two floors down in apartment A12 was setting up her weekly piano performance webcast. She's been steadily improving during the two months I've been living here. Good performance tonight, wired her a two credit micropayment as my thanks for tonight. The pizza hadn't gone bad. Not that it was some miraculous culinary marvel either.

Checked the newsbits. Scriptkiddies DDOSsing people's personal uplinks on the streets to mug them when they can't call the cops via the net. Discussion forums had a link to a patch that sets up an emergency outbound link on the id data channel. Too bad that unauthorized communication on the id data channel is a major villainy. Today's spam had a lot of personal security products. Tasers and such. Easy to fry someone's optics with one. Nasty. Updated the filter list.

Approved the home server patchlist for applying. Approved the personal unit patchlist for applying. Spent 5 millicredits worth of gridtime to compile the patches. Took a shower, set the camtex of the bathroom to a jungle waterfall. Almost hit my head to the wall. Body clean. Teeth clean. Set the overlay to wake me up in eight hours.


Ilmari Heikkinen
Last updated: 2003-02-03


The true meaning of Brexit

The true meaning of Brexit. It's right there. In your heart. It's been there all along. Brexit isn't about how many presents you receive, it isn't the fancy party, it isn't even the joyful songs you sing. Brexit is deeper than that. Brexit is the love you feel for this island and the people on it.

Brexit isn't about fighting with your neighbours and throwing your friends out into the cold. Brexit is about making a better Britain. A Britain where you can be a mechanic, a nurse, a mathematician or an art historian. A Britain where you don't have to be a banker to make ends meet. That's the real meaning of Brexit.

Brexit isn't about tearing apart our relationships and breaking our contracts. Brexit is your love for Britain, the love that will create a new Britain. A better Britain. A Britain for everyone. Article 50 isn't Brexit. Article 50 is a sword that's cutting this island apart. The first step to true Brexit is revoking article 50. Only then can we start building a better Britain.

A Brexit Britain. A Britain that will be a shining beacon of light in a world fallen to darkness. A Britain that stands proud above the waves. A Britain that doesn't require our neighbours to fund its poorest regions. A Britain that can stand alone for a thousand years, but chooses to stand together with its neighbours and create a world where our children and grandchildren can live. Proud of their ancestors. Proud of you and me.

That's the true meaning of Brexit. Revoke Article 50.


Voxel grid shortcuts

This might be nice and fast if it worked right. Voxel grid shortcuts: precompute the closest full cell for each cell-cell pair, look up the ray's entry & exit cells, jump the ray to the closest full cell. On exit, jump the ray out of the model. 🤔

But does it make sense to swap 4 steps through a 512-element 8x8x8 acceleration structure for some math & a lookup from a 262k-element shortcut list 🤔

If you do only the external faces, 64^3 grid => 6^2x64^2 accel, which might be worthwhile.
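The arithmetic behind these numbers, with my reading of the faces-only variant (treating shortcut entries as (entry cell, exit cell) pairs):

```python
# Shortcut table sizes for the full grid vs boundary cells only.
grid_cells = 8 ** 3                 # 512 cells in an 8x8x8 grid
full_pairs = grid_cells ** 2        # 262144 shortcut entries for all cell pairs

surface_cells = 6 * 64 ** 2         # 24576 boundary cells on a 64^3 grid
surface_pairs = surface_cells ** 2  # ~6.0e8 entries, vs 64^6 ~ 6.9e10
                                    # for all cell pairs of the full 64^3 grid
```

Restricting entries and exits to the grid surface is a ~100x reduction; whether ~6e8 entries is worthwhile depends on how small each entry can be packed.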

The problem in the above screenshots is that a voxel-to-voxel beam intersects more voxels than a ray would. Right now it's generating the shortcuts by tracing a ray from the center of the start voxel to the center of the end voxel to find the closest filled voxel the ray intersects. That doesn't visit all the voxels a beam would, so you get gaps in the model. And my code is broken in other ways, eh.


Fix the atmosphere for profit

1) Build enough solar / wind to run your country.
2) Double your build to cover for low production periods.
3) Use the surplus to run the Fischer-Tropsch process to convert atmospheric carbon to fuel.
4) Keep building solar / wind until synfuel production exceeds your demand.
5) Export the excess synfuel, use proceeds to build more solar / wind.
6) Once synfuel production exceeds global demand, start stockpiling the synfuel.
7) Keep going until atmospheric carbon hits normal levels.
8) Control global synfuel supply.

The economics of oil: different extraction technologies have a different price-per-barrel. If oil goes below that price, there's no profit in doing the extraction and the oil fields get shuttered to wait for higher prices.

If you can produce synfuel at a cost below an oil field's, the oil field gets shut down. Solar has been halving in price every five years, so you might well imagine an inflection point where synfuel mined from the atmosphere with solar power is cheaper than extracted oil. At that point, the synfuel company can start accumulating excess profits and squeezing traditional producers out of the market, while protecting its monopoly by acquiring nascent competitors and whatever traditional producers and oil fields it can.
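The inflection point follows from the halving trend alone. A sketch with hypothetical prices (the $300 and $60 figures below are made-up illustrations, not data):

```python
import math

def years_until_cheaper(synfuel_cost, oil_breakeven, halving_years=5.0):
    """Years until a cost that halves every `halving_years` falls below
    a breakeven price, assuming the trend holds. Same per-barrel units."""
    return halving_years * math.log2(synfuel_cost / oil_breakeven)

# Hypothetical: $300/bbl synfuel vs a $60/bbl oil field
# -> about 11.6 years to the crossover.
```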

Atmospheric mining is a zero-sum game: there's a limited amount of carbon dioxide in the atmosphere and you have to stop mining when the CO2 levels fall too low. By mining out all usable atmospheric carbon, the company can eliminate any chance of competition. To add carbon to the atmosphere, the company sells synfuel to users who burn it. The company can then mine the carbon back from the atmosphere using solar.

Because the company earlier used its excess profits to acquire unprofitable oil extractors and oil fields, it also has a source of extra carbon ready to go. As global fuel use increases, more and more carbon needs to be circulated through the atmosphere.

Similar to mining, surface solar is also a zero-sum game. To produce solar, you need land area. Once the land area is in use, it can't be used for more solar. Acquiring the best lands for solar use will make competition difficult, especially at scale.

The end state of the atmospheric carbon mining company is a monopoly over hydrocarbons and solar energy. At the end state, there are no oil reserves left and oil is a just-in-time produced synthetic product. How fast you can burn the oil depends on the speed of the atmospheric mining process. This could reach a point where an amount of oil matching the total amount drilled over the past century is circulated through the atmosphere each year. To prevent losing carbon to oceans, there also needs to be a system to extract dissolved carbon from seawater.


Beam acceleration

Noodling with beam-based acceleration. Tessellate bounding volume to N faces, connect them with N^2 beams, add primitives to beams, sort primitives inside beam, find containing beam for ray, traverse beam primitives from ray origin in ray direction.

Accelerate in-beam intersection by finding a set of primitives that completely cover the beam. Cut the beam at the rear-most primitive, tessellate the cutting plane, create new set of beams from beam entry to the cutting plane faces.

Bad: eats memory like crazy. Good: should be possible to do 2 beam classifies + 1-2 triangle intersects per ray on ~100ktri scenes. Let's see how it goes. (Also, see Fast Ray Tracing by Ray Classification (1987) by Arvo & Kirk)

Crazy = ~GB / 100ktris...


Hardware hacking

I got into these Pi devices recently. At first it was a simple "I want an easy way to control which sites are accessible on my office WiFi, to stop wasting time when I should be working", so I set up an old broken laptop to prototype a simple service to do that. Then I replaced the laptop with a small Orange Pi hacker board. And got some wires and switches and a breadboard and LEDs and resistors and ... hey, is that a Raspberry Pi 3B+? I'll get that too, maybe I can use it for something else...


I took apart a cheap RC car. Bought a soldering station to desolder the wires from the motor control board. Then got a motor controller board (an L298N, a big old chip for controlling 9-35V motors with up to 2A current draw -- not a good match for the 3V 2A Tamiya FA-130 motors in the RC car), wired the motors to it, and the control inputs to the Raspberry Pi GPIOs.

Add some WebSockets and an Intel RealSense camera I had in a desk drawer, and hey: an FPV web-controlled car that sees in the dark with a funky structured-light IR pattern. (The Intel camera is... not really the right thing for this; it's more of a mini Kinect and outputs only 480x270 video over the RPi's USB 2 bus. And apparently Z16 depth data as well, but I haven't managed to read that out.) Getting the video streaming low-latency enough to use for driving was a big hassle.

Experimented with powering the car from the usual 5 AA batteries, then the same USB power bank that's powering the RPi (welcome to current spike crash land, the motors can suck up to 2A when stalling), and a separate USB power bank ("Hmm, it's a bit sluggish." The steering motor has two 5.6 ohm resistors wired to it and the L298N has a voltage drop of almost 1.5V at 5V, giving me about 1W of steering power with the USB. The original controller uses a tiny MX1508, which has a voltage drop of something like 0.05V. Coupled with the 7.5V battery pack, the car originally steers at 5W. So, yeah, 5x loss in snappiness. Swap the motor controller for an MX1508 and replace the resistors with 2.7R or 2.2R? Or keep the L298N and use 1.2R resistors.) Then went back to the 5 AA batteries. Screw it, got some NiMHs.
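The 5x figure checks out as a back-of-envelope: into a roughly fixed resistance, motor power scales with the square of the voltage left after the controller's drop (this ignores the motor's own behavior, so it's only a sanity check):

```python
# Power ratio between the 7.5V pack + MX1508 and the 5V USB + L298N setups,
# using the voltage drops quoted above.
def usable_volts(supply, controller_drop):
    return supply - controller_drop

v_usb_l298n = usable_volts(5.0, 1.5)      # 3.5 V
v_pack_mx1508 = usable_volts(7.5, 0.05)   # 7.45 V
power_ratio = (v_pack_mx1508 / v_usb_l298n) ** 2  # ~4.5, i.e. roughly 5x
```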

Tip: Don't mix up GND and +7.5V in the L298N. It doesn't work and gets very hot after a few seconds. Thankfully that didn't destroy the RPi. Nor did plugging the L298N +5V and GND to RPi +5V and GND -- you're supposed to use a jumper to bridge the +12V and GND pins on the L298N, then plug just the GND to the RPi GND (at least that's my latest understanding). I.. might be wrong on the RPi GND part, the hypothesis is that having shared ground for the L298N and the RPi gives a ground reference for the motor control pins coming from the RPi.

Tip 2: Don't wipe off the solder from the tip of the iron, then leave it at 350C for a minute. It'll turn black. The black stuff is oxides. Oxides don't conduct heat well and solder doesn't stick to it. Wipe / buff it off, then re-tin the tip of the iron. The tin should stick to the iron and form a protective layer around it.

Destroyed the power switch of the car. A big power bank in a metal shell, sliding around in a crash, crush. It was used to control the circuit of the AA battery pack. Replaced it with a heavy-duty AC switch of doom.

Cut a USB charging cable in half to splice the power wires into the motor controller. Hey, it works! Hey, it'd be nicer if it was 10 cm longer.

Cut a CAT6 cable in half and spliced the cut ends to two RJ45 wall sockets. Plugged the other two ends into a router. He he he, in-socket firewall.

Got a cheapo digital multimeter. Feel so EE.

Thinking of surface mount components. Like, how to build with them without the pain of soldering and PCB production. Would be neat to build the circuit on the surface of the car seats.

4-color printer with conductive, insulating, N, and P inks. And a scanner to align successive layers.

The kid really likes buttons that turn on LEDs. Should add those to the car.

Hey, the GPIO lib has stuff for I2C and SPI. Hey, these super-cheap ESP32 / ESP8266 WiFi boards look neat. Hey, cool, a tiny laser ToF rangefinder.

Man, the IoT rabbit hole runs deep.

(Since writing the initial part, I swapped the L298N for a MX1508 motor controller, and the D415 for a small Raspberry Pi camera module. And got a bunch of sensors, including an ultrasound rangefinder and the tiny laser rangefinder.)
