I should begin by saying that I don’t personally enjoy the type of video game known as a first-person shooter (FPS) very much. They make me jittery when I play, and I am easily overwhelmed by them. I’m still stuck in the tutorial room with Jacob in Mass Effect 2. If there are settings to disable gore and swearing in a game, I’ll click ‘em. But as I looked back on my past posts I realized that I have neglected to weigh in on FPS, and in doing so am guilty of the same kind of dismissal I critique in colleagues. (Note to gamers: I know there are several important distinctions between FPS and TPS, or third-person shooters, but that’s for another post.)
There’s a lot to like about FPS games. Here are a few examples.
- Many FPS such as Halo 2 can be collaborative as well as violent. Players join platoons and need to learn how to coordinate, communicate and problem-solve in a fast-paced environment. Games like Halo also provide environments for players to learn how to assume leadership roles, follow directions from other players, and think critically about stressful in-world situations.
- FPS encourage impulse control as well as aggression. Crucial to success in FPS games is the ability to time attacks and maneuvers, which requires controlling the impulse to pull that trigger. Although we tend to focus on the aggression in FPS, there’s often a lot of sneaking going on as well. In Bioshock there are actual decision points where refraining from killing characters changes the entire outcome of the game. Even though the player is not learning teamwork in single-player games, they are often learning the same sorts of decision-making and impulse control found in good old-fashioned “Red Light, Green Light.”
- First-person shooters improve hand-eye coordination. One important component of hand-eye coordination is visuospatial attention. Research by Green et al. suggests that video games improve visuospatial attention, and further that FPS video games do it even better than games like Tetris. Hand-eye coordination is a skill most of us would agree is a good thing to have. It helps improve your readiness to learn, increases your ability to excel at sports, increases your confidence and makes juggling less stressful.
- First-person shooters may increase a sense of mastery and alertness. So many parents and educators lament how children aren’t able to pay attention. And yet, what makes FPS games so compelling is their immersive quality. As Grimshaw et al. discuss, the literature describes immersion in varying ways, such as “the state of consciousness when an immersant’s awareness of physical self is diminished or lost by being surrounded in an engrossing total environment, often artificial.” Further, in order to be completely immersed in an environment, players “must have a non-trivial impact on the environment.” Wandering around the game world may not be sufficient to immerse players in a flow-like state, and shooting people, whatever else you may say about it, does not lend itself to feeling trivial in an environment. Imagine if classrooms could harness that same immersive quality. It would be far more effective than saying loudly, “Pay attention!” — which usually has the exact opposite effect of the one intended.
Given the above compelling reasons to think well of FPS, why are they so often singled out as the bad seed of video games? The answer, I would suggest, is a sociopolitical one that gamers as a whole ignore at their peril.
Science is often, maybe always, political, and has an uneasy relationship with civil rights movements. The example that springs to my mind is the LGBTQ civil rights movement. Back when a preponderance of science was pathologizing all LGBTQ people, there was greater solidarity amongst the various thinkers, activists, and citizens of those subcultures. From Stonewall up through the early AIDS crisis, there was less fragmentation and more coordination, with the understanding that civil rights benefited everyone.
But within the past two decades, many members of the LGBTQ community have begun to receive recognition and acceptance in society as a whole. At this writing, seven states have legalized gay marriage (welcome, Washington!) and more accept domestic partnerships between same-sex couples. Bullying based on sexual orientation and hate crimes have received more coverage from media sympathetic to LGBTQ youth. And I can’t remember the last time I heard talk about the latest study locating the “gay gene.”
And yet, science and politics have turned their gaze towards specific subsets of the LGBTQ population. Transgender rights (a notable recent gain in my home state) are still ignored or reduced to bathroom conversations and debates about the poor parenting of those who don’t make their children conform to cisgender norms. The status of LGBTQ youth of color as a priority population is met with grumbling. Bisexuals are still considered in transition or confused, asexuals frigid or repressed. Polyamory is confused with lack of commitment or neurotic ambivalence, and BDSM isn’t even recognized as worthy of any sort of advocacy.
And to a large extent, whenever one of these specific subcultures is targeted, the other factions of the LGBTQ community remain silent. And in doing so, they become allied with the perpetrator. As Judith Herman points out in her seminal work, Trauma and Recovery, “It is very tempting to take the side of the perpetrator. All the perpetrator asks is that the bystander do nothing.” This is exactly what members of the LGBTQ community are doing when they fail to maintain the solidarity and mutual support that helped get homosexuality removed from the DSM-III.
And so the focus shifts from the general “gay people are bad/sick” to the more specific populations also under the LGBTQ umbrella, and rather than fighting for them we allow them to be omitted from civil rights. A case in point was made by openly trans HRC member Donna Rose, who resigned in protest when HRC supported an Employment Non-Discrimination Act that included sexual orientation but omitted protections for transgender people. A group may only be as strong as its weakest member, but solidarity often ends when the strongest members of an alliance get what they want.
The gaming community would do well to take a lesson here. Recently video games have been getting increasing recognition as an art form, an educational tool, and a possible solution to world problems ranging from poverty to AIDS. As society moves to a more progressive stance on technology and video games, studies come under scrutiny for their sweeping, pathologizing generalizations about a complex and diverse group.
(The most pernicious example of this, in my opinion, is the concept of the “screen” and “screen time.” Studies ask questions about how much time subjects spend in front of electronic devices, as if all activities were identical in experience and effect. Watching television, playing video games, and surfing on Facebook are all treated as similar neurological phenomena, when they aren’t. It’s much more complicated than that, and different physiological systems are affected in different ways. Even the idea that all screen time dysregulates sleep the same way is being questioned recently, with televisions showing less suppression of melatonin than iPads. So which screen you’re doing things on makes a difference. And then there’s what you are doing.
Watching television is, in my experience, a more passive and anergic activity than playing video games. No, I’m not going to cite a particular study here, because I want us to focus on thinking critically about the designs of studies, not the data. And as Paul Howard-Jones points out in his video, learning itself activates different parts of the brain at different phases of an individual’s learning cycle for a particular activity. So yes, video game users have different-looking brains than those who aren’t using them. That doesn’t mean games are bad; it means players are using different parts of their brains and learning different things. Most people in the gaming community would feel some solidarity here with other gamers, and balk at the idea that a screen is a screen is a screen. And “screen time” usually implies watching television, playing games, or surfing the net — not compiling doctoral dissertation lit reviews, planning a vacation, doing your homework, or looking up a recipe.)
So gamers are solidly behind fighting these blanket generalizations. That’s great. But I find that gamer solidarity is starting to fall apart around the more specific attacks being levied in science and politics against FPS and violent games. Studies say these desensitize children to violence, increase aggression, and correlate with hostile personalities. There are also studies that conflict with these findings, but I want to ask a different, albeit more provocative question:
What’s wrong with being aggressive?
Children’s play has a long history of being aggressive: Cops and Robbers, water pistols, football, wrestling, boxing, and tag all encourage some element of aggression. Most of us have played several of these in our lifetime with some regularity — have we become desensitized and aggressive as a result? Am I sounding too hostile? :-)
And we are sending children and adolescents a mixed message if we label aggression as all-out bad. Not everyone or every job requires the same amount of aggression. Wanting to be #1 and competing, whether as a boxer or a president, requires some aggression. Aggression is in fact a leadership quality. It allows us to take risks, weigh the potential hazards, and go for something. Feelings of aggression can sharpen our senses, speed up our assessment of a situation, and help us stand up to bullies. Whether we agree with this war or that, would we really want our soldiers to be in-country with no aggression to help them serve and defend? Fortune, as the saying goes, favors the bold, not the timid.
FPS games have a place on the GameStop shelf and a place in the gaming community. They allow us to engage in virtual activities that have real-life benefits. They are a focal point for millions of gamers, and I believe unlocking their DNA will go a long way towards discovering how to improve work and learning environments. Stop letting critics shoot them down, or don’t be surprised if you’re in the crossfire next.
Mike Langlois, LICSW