What if I told you that there was an internet product you used on a regular basis that watched what you did every time you used it and actually changed its appearance based on your preferences? Your friends often used the product, too, and it knew how you interacted with them in this online setting. And in the end, this product would manipulate your moods, sometimes swaying you toward different feelings or pushing you toward particular decisions.
Then I would tell you that you’re playing a videogame.
There’s no need to belabor it here, but you probably heard something about Facebook’s “emotional contagion” study that was quietly announced and then loudly protested last week. The Atlantic has the full rundown, but essentially Facebook data researchers conducted a real-time experiment on nearly 700,000 participants (most of them presumably unwitting) and managed to manipulate their moods based on what Facebook showed them. For many, this indiscretion was the latest in a long series of breaches of trust.
But if you’ve been playing games on the internet, then Facebook’s experiment shouldn’t be that surprising. In fact, whatever game you’re playing right now is likely studying you.
Mike Ambinder, formerly of Valve and now Oculus Rift, has been a proponent of biofeedback for game designers, saying the field will help create a more accurate picture of who players are based on how they sweat and how their heart beats. It’s only a matter of time before technology like this becomes more commonplace in the home. Nintendo experimented with a “vitality sensor” and the Microsoft Kinect can actually track your heart’s rhythms.
At Microsoft’s User Research Lab, for example, Eric Schuh, senior user researcher at Studios User Research, leads teams to figure out exactly what’s not working in a game like Halo. They use a wide variety of in-person techniques, like play testing and watching players through two-way mirrors. But the dream is to go beyond what they’re able to do with the thousands of people who pass through their research facility’s doors.
“I’m really interested in looking at stuff outside of our labs,” Schuh said in 2012. “We do do some research outside of our labs, but I am really interested in what happens when you’ve got a smart device in your hand that has photographic and voice capture capabilities, and how you can combine that with the type of research that we do, to understand better how people are using technology and games.”
And that dream is now a reality. At last year’s App Developer’s Conference, Suhail Doshi, founder of the mobile and web analytics company Mixpanel, extolled the virtues of rigorous data collection to help mobile game makers create more enticing (read: emotionally captivating) experiences. Mixpanel analyzes more than 6.2 billion actions a month for over 500 customers, including Airbnb. By parsing users’ actions, Mixpanel helps mobile game makers make smarter decisions about keeping users around.
Then there’s Chethan Ramachandran, CEO of the SF-based company Playnomics, who’s dissecting games’ player-by-player actions in real-time to unlock psychological profiles, to better understand our motivations, and, of course, to enable the creation of custom game experiences. “The segmentation of the audience can be nearly infinite,” he told us. “It can be behavioral, predictive, it can tie in the nitty-gritty game events, it can tie in to personality. It paints a really rich profile of how people play.”
So if you think that the Facebook experiment was bad, games are already creating the environments to test on us while we play. As Ninja Metrics CEO and USC sociologist Dmitri Williams has said, games can be the perfect “petri dish” for social experimentation and can help us understand the underlying motivations for what makes us tick.
Of course, Facebook is your “real life,” in a sense, and games are a separate arena of living. And as Laurie Penny writes in the New Statesman, “There are no precedents for what Facebook is doing here. […] Facebook itself is the precedent. The ethics of this situation have yet to be unpacked.”
But remember that what frightens and frustrates people the most about Facebook’s experiment isn’t the fact that they were testing. Companies like Facebook test different elements of their products every day. Google has run hundreds of variations of its homepage. Investor Chris Dixon expressed legitimate confusion:
If you a/b test to make more money that’s fine, but if you a/b test to advance science that’s bad? I don’t get it.
— Chris Dixon (@cdixon) June 29, 2014
Intention is obviously very important. But game makers, like Facebook, conduct experiments (because all user tests are experiments) ultimately to make more money. The stated purpose is “usability” but, if we’re honest, it’s about making money, which, in and of itself, is not a bad thing.
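For readers unfamiliar with the mechanics, the A/B testing Dixon describes is simple in principle: each user is deterministically assigned to a variant, and the company compares outcomes between groups. Here is a minimal sketch in Python; the experiment name and variant labels are hypothetical, and real systems (Facebook’s included) are far more elaborate.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically bucket a user: the same user in the same
    experiment always lands in the same variant."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# With many users, the hash spreads them roughly evenly across buckets.
counts = {"A": 0, "B": 0}
for i in range(10_000):
    counts[assign_variant(f"user{i}", "feed_tone_experiment")] += 1
print(counts)
```

The point of hashing rather than random assignment is consistency: a user sees the same variant on every visit, so any measured difference in behavior can be attributed to the variant rather than to churn between groups.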
What people are actually concerned about is that Facebook was changing people’s emotions without informed consent—or, worse, unknowingly pushing unstable people over the edge. But those of us who play games have our emotions toyed with all the time. Games raise our blood pressure, activate our brain chemistry, and so on. All games manipulate our emotions. It’s part of what makes games tick.
Two5six ’13 speaker and Riot Games social systems designer Jeff Lin, for example, has been conducting “priming” experiments in an attempt to reduce negative behavior. Simply changing the colors of the text influenced toxic behavior. Blue was soothing; red riled people up. Zynga has used similar tests to discern which colors are more likely to generate more interest.
I asked Dmitri Williams what he thought of the Facebook experiment. Williams was one of the first to take live player data from EverQuest to research a variety of social systems, including economics. “My first reaction was jealousy,” he said. “I understand the ethics of this space. But the ability to do what they can do and test social networking theories at massive scale. It’s a great tool that can advance human sociology in a quantum way.”
But players are willing to trust Riot in a way that they don’t trust Facebook, even if the activities are exactly the same. “Their relationship between business and customers is different,” Williams says. “Riot has a player-centric culture.” So perhaps the outrage is less a reflection of the process and more a response to Facebook’s perceived encroachment over “private” areas of our lives.
This may be, but that doesn’t deflect the reality that games affect us in ways that can ultimately be exploited. As University of Wisconsin mathematics professor Jordan Ellenberg wrote, “From the point of view of social science, making millions of people’s lives imperceptibly sadder is an ethical red flag. But for a company whose survival depends on turning ad views into money, maybe it’s just business.” Let’s hope that those behind the wheel have our best interests at heart, and not merely their own.
Flock image via Karen