The term millennial refers to the generation following mine, Generation X, who were born between the early 1980s and 2001. There certainly may be some differences within the millennial cohort in terms of race and social class, but in my experience working in both urban and suburban settings, technology use is not one of them. In fact, technology has probably exacerbated some of the traits millennials are known and often criticized for. Social media has made expression more democratic and amplified, and millennials cite self-expression as extremely important. Growing up with the internet has also placed them in the same social and informational spheres as their parents more than previous generations, making them more civic-minded than rebellious and giving them different, some would say overly dependent, attachments to their parents.
Common complaints about millennials include that they are entitled, tethered to their parents, unable to commit to long-term goals, averse to sustained effort, and in need of a constant stream of praise for the most minimal pieces of work. The other side of this coin is worth noting, too: a heightened sense of self-expression has led to millennials’ greater acceptance of diversity in others; they are more comfortable switching jobs and organizations, and with working outside the box in general. Yes, they may also have a higher tendency to blame external rather than internal factors for their problems, but having come to self-awareness post-9/11, can we really blame them?
In my work, I often encounter children, adolescents and young adults who are failing in school for a variety of reasons. These “millennials” avoid attending, and often the blame is placed on excessive video game use. They are seen to be escaping from reality, and although I can understand this perspective, it also puzzles me in some ways. Video games would seem to me, in many ways, to be jumping from the frying pan into the fire: they are rife with failure; in fact, Jane McGonigal cites the statistic that people fail 85% of the time when playing video games. MMOs often require even more collaboration, sharing and critical thinking between individuals than classrooms do in any given 30-minute period.
Millennials are often criticized as post-academic workers as well, for having less job loyalty, needing constant feedback, and expecting that feedback to be praise. In more affluent school districts I often heard their parents described as helicopter parents, who would email the school minutes after receiving the report card to debate the grades and pressure educators to change them. This has led to such grade inflation in my experience that my graduate school students are hurt and insulted when they get a B+ on a paper, sometimes to the point of tears. I can’t remember a class I had in college where I wasn’t listening to a lecture; millennials, by contrast, are constantly asking for more small-group work. I’ve even had the occasional call from a parent about their child’s performance. Did I mention that I taught in graduate school?
From the above criticisms you’d think I was down on millennials, and you’d be dead wrong. Because I think for the most part the millennials are happy, tolerant, and more likely to help others voluntarily than other generations, and the Pew Research on them bears this out. And I think that a major reason for this is that they play video games.
The video games of today and the past decade have morphed from Pong and Space Invaders to Halo and World of Warcraft. They have set up myriad game worlds where surviving and thriving require critical thinking, social collaboration, and lots of trial and error for mastery. These games have also been played by over 90% of the millennial population, and I would suggest that the result is that millennials have been conditioned to be more collaborative, expect feedback to be quick and positive, and be more connected to others through technology.
Then we send them to school, and it is frustrating for a majority of them, a majority of teachers, and a majority of parents. Rather than encouraging them to be “lifelong learners,” education as it is currently structured aims to produce a very narrow form of educated person, one that Sir Ken Robinson describes in his TED talk as an “academic professor.” In addition, we all start to become impatient for millennials to adopt our own, often individualistic, notions of what adulthood is. They need to stand on their own two feet, work without constant reassurance, and memorize things that they could just as easily Google. All to get into the right college, and all to get a good job.
We criticize the millennials’ work ethic for many of the same reasons: They won’t take individual responsibility for projects, they have trouble working independently, and they expect an award merely for being present. They need to take things more seriously and put their noses to the grindstone; no one has time to hold their hand anymore. These are all complaints I have heard levied against adolescents and young adults in my work, and the implicit message is that it is time to grow up.
One of the greatest things we can learn from millennials is something that I think they learned from video games: how to destigmatize, and even enjoy, failure. The epitome of this for me is the “Heron” video, “The Greatest Spelling Bee Fail/Epic Win of All Time,” which was originally posted on YouTube by the millennial who flubbed it. This ability to maintain an observing ego and a sense of humor about oneself is something many of us in psychotherapy work with our patients for years to achieve, and yet as a generation millennials seem to have grasped it more easily.
Part of my work with gamers is often to explore this paradox: Why is it fun, or at least okay, to fail so much in video games, and so intolerable to fail at work or school? Sure, part of it is that play, according to Huizinga, is a magic circle marked apart from real life. But games impact the same brain, the same emotions, that exist inside and outside of that circle. And if that is the case, there must be some transferable skills. We work on how to destigmatize failure outside the game as well.
Innovation requires lots of trial and error, and lots of failure. As educator Lucas Gillispie said at a recent education conference in Second Life, it makes little sense to penalize students so harshly for getting a 69%. Rather than seeing it as having acquired more than half the knowledge assessed, we make it a source of embarrassment and usually require that they repeat the entire exercise, class, or grade. Millennials have grown up with a split view of failure. On the one hand, video games have helped them understand that failure can be fun, even when you’re failing 85% of the time. On the other, they are put in educational environments where the A is everything, and the goal of learning is to get high marks rather than to enjoy the creativity and critical thinking involved. In fact, A’s are so limiting! Why not focus on a high score, which can always be improved upon, in school? If the best you can do is an A, then you have to resort to accumulating the most A’s possible, which is less intrinsically rewarding and dynamic.
Many detractors will say that millennials need to get with the existing program, that what I am suggesting is dumbing down the curriculum, or that I am being a Pollyanna and that some jobs just aren’t capable of being fun. But for over a decade companies like IBM have found success modeling work environments on MMOs, and schools that institute dance classes notice higher math scores. And the solution to our economic and occupational troubles may not be the return of a “work ethic” or more jobs, but the creation of new types of schools and jobs, work we can’t even imagine yet because it hasn’t been innovated or invented. Can you imagine some 14th-century youth telling his farmer dad, “I don’t want to work on the farm. I’d like to create and use something that applies pressure and ink to paper to make reading and writing something we can all do”?
It probably isn’t a coincidence that the word “epic” has become ubiquitous over the past several years, with so many millennials and others playing video games like World of Warcraft. And it has come under fire from many of my colleagues, who maintain that in a culture where everything is Epic, nothing truly is. I’m not saying that everything is Epic, but I am saying that there can be some Epic every day. I’m talking about teachable moments, flow, success, even the Epic fail that we can laugh off with colleagues before redoubling our efforts to nail it next time. What we’re really learning here is how to tolerate frustration.
Millennials know that “epic” is a superlative; they’re not devaluing the currency of that word. If anything, I think this is a sign that Buddhist thinking is becoming more integrated into the 21st century: It is Epic that we are here alive in this moment, that we want and fear so much, and that struggles ensue from those things. There are a lot of levels left to unlock and problems to be vanquished in the world, and we need to cultivate optimism and positive psychology at school and in the workplace, not stomp on them.
Millennials often have that sense that there can be some Epic every day. Video games offer worlds where there can be some Epic every day, too. Let’s start noticing it.
Mike Langlois, LICSW