art with code


Filter Bubbles

Filter bubbles are the latest trend in destroying the world. Let's take a look at how one might construct this WMD of our time and enter a MAD detente with other social media aggregators.

First things first: what is it that we're actually building here? A filter bubble is a way to organize media items into disjoint sets. A sort of sports team approach to reality: our filter bubble picks these items, the others' filter bubble picks those items. If a media item appears in multiple filter bubbles, it's called MSM (mainstream media).

How would you construct a filter bubble? Take YouTube videos as an example. You might have a recommendation system that chooses which videos to suggest for watching next. More binge-watching means more ad views on your network, which increases the amount of ad money coming your way, which lets you buy out the competition and become the sole survivor. At the same time, time spent on YouTube is time not spent on other ad networks, which makes YouTube ads more valuable and further increases your ad revenue. So, a recommendation system for binge viewing it is!

Suppose you watch a video about the color purple. The recommendation system would flag you as someone who's interested in this kind of stuff (hey, you watched one.) So it'd go and check other people who also watched that video and try to find some commonalities. Maybe 60% of them also checked out a video about the color red. The red video would score high on "watch next". Suppose that 10% of the purple-watchers also gave a thumbs down to a video about the color blue. The recommendation system would avoid showing you the blue video because you might not like it.

So you go and watch the red video. Now, red video viewers might have very strong opinions about blue, and 20% of them vote down blue videos. Some of them also voted down purple videos. The recommendation system would now know not to show you blue videos under any circumstance, and steer away from purple videos as well. On the other hand, the red video viewers quite liked some extremist red videos that dove deep into the esoteric minutiae of the color red. Might be something that pops up on your watch next list, then.

You go and watch an extreme red video. Suppose the extreme red viewers have started disliking more mainstream red videos, not to mention their great dislike for blue and purple videos. Now the recommendation system avoids showing you blue, purple and mainstream red, and populates your watch next list with the purest shade of extreme red.
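The co-watcher logic in the story above can be sketched as a toy scorer. This is a hypothetical illustration, not any real recommender: the video names, the co-watch/co-dislike fractions, and the `dislike_weight` penalty are all made up to mirror the purple/red/blue example.

```python
def recommend(history, co_watch, co_dislike, dislike_weight=3.0):
    """Score candidate videos by how often co-watchers of the user's
    history also watched them, penalized by how often they disliked
    them. Returns (video, score) pairs, best first."""
    scores = {}
    for watched in history:
        # Boost videos that co-watchers of `watched` also watched.
        for video, rate in co_watch.get(watched, {}).items():
            if video in history:
                continue  # don't re-recommend what was already seen
            scores[video] = scores.get(video, 0.0) + rate
        # Penalize videos that co-watchers disliked.
        for video, rate in co_dislike.get(watched, {}).items():
            scores[video] = scores.get(video, 0.0) - dislike_weight * rate
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Fractions from the story: 60% of purple-watchers also watched red,
# 10% of them disliked blue.
co_watch = {"purple": {"red": 0.6}, "red": {"extreme_red": 0.5}}
co_dislike = {"purple": {"blue": 0.1}, "red": {"blue": 0.2, "purple": 0.1}}

print(recommend(["purple"], co_watch, co_dislike))
# red scores +0.6, blue scores -0.3: red tops the "watch next" list
```

Note how asymmetric the penalty is: a dislike rate a few times smaller than a watch rate can still bury a video, which is exactly the mechanism that walls off blue and purple once you're a few steps down the red path.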

Welcome to the filter bubble.


The extreme red videos here are an example of an attractor. If you think of the recommendation system as a vector field that guides the viewer in the direction of the recommendation vectors, the extreme red topic would form a sort of black hole. Topics around it have recommendation vectors that point towards extreme red, but extreme red doesn't have recommendation vectors that point out of it. Once you enter the topics that surround extreme red, there's a high likelihood that you get sucked into it. If you don't get sucked into extreme red, the company would regard that as a failing of its recommendation system and devote time and effort to improving its capability to suck you towards extreme red.
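The black-hole picture can be made concrete by treating "watch next" as a random walk over topics. The transition probabilities below are invented for illustration; the only structural assumption is the one from the text: extreme red has inbound edges but no outbound ones.

```python
import random

# Hypothetical "watch next" transition probabilities. Note that
# extreme_red only points back at itself: an attractor.
transitions = {
    "purple":      [("red", 0.7), ("purple", 0.3)],
    "red":         [("extreme_red", 0.5), ("red", 0.3), ("purple", 0.2)],
    "extreme_red": [("extreme_red", 1.0)],
}

def walk(start, steps, rng):
    """Follow weighted 'watch next' links for a number of steps."""
    topic = start
    for _ in range(steps):
        nexts, weights = zip(*transitions[topic])
        topic = rng.choices(nexts, weights=weights)[0]
    return topic

rng = random.Random(42)
ends = [walk("purple", 20, rng) for _ in range(1000)]
print(ends.count("extreme_red") / len(ends))
# almost every walk ends inside the attractor
```

With even a modest inbound probability per step and zero outbound probability, absorption is nearly certain over a binge-length session: the vector field has nowhere else to drain.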

Attractors are special topics. Special in that they make people inside the attractor pull more people into the attractor and prevent their escape. Otherwise they'd be more like regular popular topics: you get drawn into a popular topic, but there's always an escape route towards another popular topic. To make an attractor, the content in the attractor needs to promote behavior that blocks escape from the attractor. For example, an extreme red video that says that mainstream red, purple and blue are all paid shills plotting to destroy the world would call its viewers to vote down other views.

Breaking filter bubbles

If there's an attractor in your recommendation vector field, scramble it. Mark that topic as one where all the watch next links go to far-away places, or even to places preferentially disliked (i.e. stuff that the group dislikes more than the whole population does) by the people in the attractor. Reduce the ad payout for content near attractors. Decrease the recommendation weighting of attractor neighborhoods so that escape is more likely.
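One minimal sketch of that last intervention, assuming you already know where the attractors are and how many hops each topic sits from the nearest one (both the distance map and the penalty factor here are hypothetical):

```python
def deweight(scores, attractor_distance, min_distance=2, penalty=0.5):
    """Multiply the recommendation score of any topic within
    `min_distance` hops of a known attractor by `penalty`, so
    topics outside the neighborhood win more often."""
    return {
        topic: score * (penalty if attractor_distance.get(topic, 99) < min_distance
                        else 1.0)
        for topic, score in scores.items()
    }

scores = {"extreme_red": 0.9, "red": 0.6, "travel": 0.4}
dist = {"extreme_red": 0, "red": 1}  # hops from the nearest attractor (assumed)
print(deweight(scores, dist))
# {'extreme_red': 0.45, 'red': 0.3, 'travel': 0.4}
```

After de-weighting, the previously unrelated topic outscores the attractor's neighborhood, which is the point: the escape routes get the top slots back.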

Create legislation to warn people of viral attractors. Require explicit user consent to apply binge-inducing user engagement systems. Ban binge-inducing products from public spaces and require binge-inducing sites to post warning signs with pictures and cautionary tales of addicts who had their lives ruined by Skinner boxes.
