The A/B Tested Reality

John Cline
4 min read · Jan 16, 2017
Best Housekeeping

I was running on the treadmill at my gym the other day, which has a lovely view of the street and the shop sign above it. As my mind wandered in the boredom that is the treadmill, I started to wonder how this sign came to be and, more importantly, how it appears to have remained the same for the last twenty years despite advances in marketing and design.

The first thought I had was: why the GE logos? Is GE still well known for appliances? I haven't ever bought an appliance, so I'm not sure whether LG or Honeywell or Samsung or whoever is doing a great job right now. Would GE attract the most customers? There are two prominent placements and a lot of real estate on this sign; could it be used for something better?

Then I thought about how they would A/B test it. If we wanted to try out an LG logo instead, say, how would that work? You could put up some LG logos for a few weeks and see if sales improve. That wouldn't account for a general uptick in sales over that period unrelated to the sign change, however.

You could change the sign and then survey customers: did you come in because of the large LG logos? But that's probably not how most customers would answer. They'd likely say they just needed a new washer or refrigerator.

So what if we wanted to run a true A/B test, where we can control for all other factors except for the sign change? How could we alter reality such that a roughly evenly distributed population saw either the current GE sign or the new LG logo?
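The mechanics would be the same as any online A/B test. Here's a minimal sketch, with entirely hypothetical names and numbers: hash each viewer's identity into a stable 50/50 bucket, so every passerby always sees the same sign, then tally viewers and conversions per variant.

```python
import hashlib

def assign_variant(viewer_id: str, experiment: str = "ge-vs-lg-sign") -> str:
    """Stable 50/50 assignment: the same viewer always sees the same sign.

    Hashing the (experiment, viewer) pair makes the split deterministic,
    so repeat passersby don't flip between variants and contaminate the test.
    """
    digest = hashlib.sha256(f"{experiment}:{viewer_id}".encode()).digest()
    return "GE" if digest[0] % 2 == 0 else "LG"

def tally(events):
    """events: iterable of (viewer_id, converted) pairs.

    Returns {variant: [viewers, conversions]} for a simple rate comparison.
    """
    counts = {"GE": [0, 0], "LG": [0, 0]}
    for viewer_id, converted in events:
        variant = assign_variant(viewer_id)
        counts[variant][0] += 1
        counts[variant][1] += int(converted)
    return counts
```

The deterministic hash is the key design choice: unlike flipping a coin per sighting, it guarantees one person never sees both signs.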

Fortunately, the future is here.

(horrible mixed reality promo shot)

Mixed (or augmented) reality is starting to get to a point where this sort of thing is a real possibility. My train of thought logically went to something along the lines of "well, we can't just manually flip the sign for a bunch of people on the street without influencing the test, but if it were digital or a display ad we could…which we could do if enough people were wearing mixed reality devices."

I’m fairly confident that at some point in the nearer-than-you-think future, the majority of people will wear some sort of mixed reality device almost 100% of the time. If the utility and ubiquity of the smartphone has shown us anything, it’s that humans are information addicts. As we make the experience of consuming information more seamless by making it more actionable and contextual, mixed reality will become the default way we connect.

When we are all using mixed reality devices, testing like this becomes not much more difficult than Google testing 41 shades of blue. Whichever company ends up shipping these devices will likely include some form of support for advertising, and optimizing ads (and real-life conversion) is a natural follow-on feature. In this case, Best Housekeeping can see which logo best drives foot traffic (or purchasing), along with potentially evaluating the entire sign design. Now, whether testing this kind of change and measuring conversion on an infrequent and large purchase like an appliance would make a difference is an open debate, but it would technically be possible.
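Once viewers are split evenly, deciding whether one logo actually drives more foot traffic is a standard comparison of two conversion rates. A sketch using a two-proportion z-test with only the standard library (the counts below are made up for illustration):

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test: do variants A and B convert at different rates?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both signs convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF, Phi(x) = (1 + erf(x/sqrt(2))) / 2.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 120 of 10,000 GE viewers converted vs. 150 of 10,000 LG viewers.
z, p = two_proportion_z(120, 10_000, 150, 10_000)
# A small p-value would suggest the LG sign genuinely converts differently.
```

As the post notes, for a purchase as infrequent as an appliance the conversion counts would be tiny, so reaching significance could take a very long time.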

Much better, right?

This opens up a whole host of other ethical issues though. Humans already live in a subjective reality; what happens when our own interpretation is based on an objectively different experience of reality? When two people looking at the exact same thing see something different, and both are factually correct?

Minority Report and Black Mirror have both somewhat addressed this issue. Minority Report approached it more from the perspective of what happens when you are universally recognized and marketed to, and Black Mirror from several perspectives crossing mixed and virtual reality, most poignantly with "Men Against Fire."

Reality, as it usually does, will likely fall somewhere between these interpretations. This is an interesting thought exercise for now, but it won't be long before what someone saw on their mixed reality device becomes central to a court case. We as technologists must address the ethical implications of what we create before one of our algorithms accidentally ends up swaying the outcome of an election.