Work, Life and Marshmallows: Iterative Design


They say the journey of a thousand miles begins with a single marshmallow.  Ok, I say that, and more specifically I am talking about your life, job or relationship rather than a journey.  I am coming back to my practice from a brief sabbatical, and have been noticing that while many things are going to stay the same, a few are changing as well.  I’ll get back to the marshmallow in a minute.

One thing I learned on my sabbatical is that I definitely want to continue my therapy practice.  As I said to some friends on Facebook this week, “You know, I’m kind of grateful that I get to challenge the self-hatred of others for a living.”  As a clinical social worker and psychotherapist I get paid to do that.  One thing I also decided on my leave was to withdraw from the last managed care insurance panel I was on.  It made no sense to continue to decrease the time I could be seeing people due to paperwork and bureaucratic hassles, and it made no financial sense to have a waiting list of people who are willing to pay my full fee and also deserve treatment just so I could work at half my rate.  I have always built pro bono or sliding scale slots into my practice because I have a commitment to serving a diverse population, so why was I doing that and letting an insurance company slide the remaining hours of my week?

Part of the answer to this and most “why-have-I-been-doing-this-this-way-when-it-doesn’t-work-in-my-favor?” questions is fear. Most of us are afraid of change.  Whether we are staying in an abusive relationship, having difficulty getting sober, flunking out of college or missing days at work, most of us have moments when we see what we are doing to ourselves and ask the above question.  And then we often resume whatever the pattern is, leaving an interesting question unanswered and instead turning it into self-recrimination, which is really just evasion.  Another part of the answer is that we often act as if we only get one shot at answering the question of life satisfaction.  Here comes the marshmallow.

Invented by Peter Skillman of Palm, Inc. and popularized by Tom Wujec of Autodesk, the Marshmallow Challenge may be familiar to some of you:  “It involves the task of constructing the highest possible free-standing structure with a marshmallow on top. The structure must be completed within 18 minutes using only 20 sticks of spaghetti, one yard of tape, and one yard of string.” (per Wikipedia)  You can call it an exercise, or play, but in either event the creators of the challenge have observed something very interesting about how different groups tend to approach it.  Children tend to make a first structure, stick the marshmallow on top, and then repeat the process over and over, refining it as they go.  Adults tend to engage in group discussions, arguments, power plays and plans to produce one structure built once to which the marshmallow is added.  In other words they tend to approach it derivatively rather than iteratively.

Iterative design is a method of creating a thing or addressing a problem by making a prototype (a first attempt), testing it, analyzing the results, and then refining it.  Rinse and repeat.  Iterative design isn’t good for everything: As parents know, sometimes there just isn’t time for everything to get done in 18 minutes or before the school bus gets here.  But a life built on derivative design alone is destined for stagnation and rigidity.
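For the programmers in the room, the loop is simple enough to sketch in a few lines of Python; every name here is a hypothetical stand-in for whatever building, testing, and refining mean in your own project:

    def iterative_design(build, test, refine, good_enough, max_rounds=5):
        prototype = build()                          # first attempt
        for _ in range(max_rounds):                  # the 18-minute budget
            feedback = test(prototype)               # stick the marshmallow on top
            if good_enough(feedback):
                break
            prototype = refine(prototype, feedback)  # analyze, adjust, repeat
        return prototype

    # Toy run: "design" a spaghetti tower by shortening it until it stands.
    tower = iterative_design(
        build=lambda: 50,                          # start with a 50 cm tower
        test=lambda height: height <= 30,          # pretend only towers of 30 cm or less stand
        refine=lambda height, stood: height - 10,  # shorten it and try again
        good_enough=lambda stood: stood,
    )
    print(tower)  # 30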

Derivative design, as the name suggests, takes something from a pre-existing something-else, whether it be a rule, materials, social construction or interpretation of the something-else.  When you psychoanalyze a patient’s dream and interpret it as a manifestation of their Oedipus Complex, you are deriving your interpretation and their dream from the something-else of Freud, who in turn derived his Oedipal Conflict theory from the something-else of Greek mythology.  Derivative design can save time and effort in many important ways, by collapsing cultural memes and thinking and transmitting them forward through time from Sophocles to your office.  But as feminist thinkers and cultural critics have shown us, we might have arrived at a different “complex” if Audre Lorde et al. had been in on the prototyping of it.

Derivative thinking left unchecked can get you in a rut.  One of my most recent examples of this comes from The Little Prince, where the little prince encounters the drunkard:
“- Why are you drinking? – the little prince asked.
– In order to forget – replied the drunkard.
– To forget what? – enquired the little prince, who was already feeling sorry for him.
– To forget that I am ashamed – the drunkard confessed, hanging his head.
– Ashamed of what? – asked the little prince who wanted to help him.
– Ashamed of drinking! – concluded the drunkard, withdrawing into total silence.
And the little prince went away, puzzled.
‘Grown-ups really are very, very odd’, he said to himself as he continued his journey.”

Everything derives from the previous thing, but in the end it sometimes gets us nowhere.

We all get in these difficult spirals.  A good therapist or supervisor can point them out to us and then encourage us to become iterative in our design:

  1. So what are you going to do this time?
  2. How did that work out?
  3. So what are you going to do differently?

Therapists starting their private practices also come to see me, often stuck in derivative thinking:

-I need my NPI number.
-Ok, why?
-To get on Medicare.
-Ok, because?
-So I can get on insurance panels.
-Ok, why?
-So I can get patients who will pay me so I can rent an office so I can have an address to register for my NPI.

If you are one of my consultees reading this, rest assured I am NOT talking about you in particular:  I have had this conversation a hundred times with people.  We get indoctrinated into the world of managed care and get, well, managed.  In this case, I usually recommend the consultee start by imagining what kind of office space they want.  Answers have varied and included: Sunny, exposed beams, plants, yellow paint, toys, music system, waiting room with receptionist, friendly colleagues in suite, accessible to public transportation, elevator, warm colors, cool colors, and all sorts of other iterations.

Once you have a mental prototype you can either build or design your office, or find and rent it.  Again I tell folks to walk around the areas they want to work in, find buildings that look interesting to them, then walk inside and ask to speak with someone about seeing a unit.  Testing involves going to see several spaces.  Then they can analyze the results: Does the space look like it could become what they imagined once it is furnished?  Are there things about their ideal that need to be discarded?  Do they now realize that they could be even more wild in their expectations?

This is just one example of the ways that iterative design can open up possibilities.  But be warned, iterative design can be daunting for many of us raised in our current education system.  We have been trained to create one product presented in final form with the expectation that we will be graded on that product alone.  Everything becomes about that one paper or exam, which is often more about regurgitation than innovation.

I have colleagues who take my breath away with the number of projects and ideas they are consistently throwing out there to see what happens:  It takes guts to do that.  I myself often am afraid that the Project Police are going to pop out and say, “What happened to your idea of a Minecraft group?  Shame on you for proposing it and not completing that project!  You are not allowed any more ideas until you show us you can carry that one out.”

Sound ridiculous? Of course it is, but does it sound familiar to you as well?  If it does, go out and buy yourself some spaghetti, tape and marshmallows:  The quality of your job, relationship and life may depend on it.

Interested in setting up a consult for your practice?  I have some openings come March.  Like this post? I can speak in person too, check out the Press Kit for Public Speaking info. And, for only $4.99 you can buy my book. You can also Subscribe to the Epic Newsletter!

No Matter How You Feel, You Still Failed


Psychotherapists are often people who prefer to deal with feelings in their work with people.  Feelings are important, and being empathically attuned to how patients are feeling is equally important.  We are taught to explore the patient’s feelings, imagine ourselves into their lived experience, and validate that experience.

This is often where we become disconnected from other professionals we collaborate with, such as educators.  Be it Pre-K or graduate school, educators are charged with helping students learn and grow as whole people.  It’s not that they aren’t concerned with feelings; they just can’t get hung up on them to the exclusion of everything else.

To be fair, psychotherapy has a long history of taking a broader view on the individual as well.  A famous psychoanalyst, Winnicott, once responded to a patient of his who was expressing feelings of hopelessness by saying something to the effect of “sometimes when I am sitting with you I feel hopeless too, but I’m not going to let that get in the way of continuing to work with you.”

But often in the past decade or two, feelings have held sway over everything.  Students don’t complete their assignments because they felt overwhelmed and still expect to pass the course.  Adults feel emotionally exhausted and miss work or are late to it.  Children feel angry at the injustice of chores and don’t do them but still want their allowance.

A criticism I often hear toward video games is that they encourage people to believe that they can always just reset, do over and have another shot.  But implicit in this criticism is something I feel video games actually do better than many of us sometimes:  They acknowledge the reality of failure.

When we play video games, we are failing 80% of the time.  Failing in the sense of Merriam-Webster’s definitions, including:

  • to not succeed : to end without success
  • to not do (something that you should do or are expected to do)
  • to fall short <failed in his duty>
  • to be or become absent or inadequate
  • to be unsuccessful

In video games the reality of this is driven home to us by a screenshot:

[Screenshots: “Game Over” screens from Minecraft, Warcraft, and Pac-Man]

You can feel any way you’d like about it: angry, sad, annoyed, blasé, frustrated with a touch of determination.  But no matter how you feel, you still failed.

In life outside games, many of us have a hard time accepting the reality principle when it comes to failing at something.  We think we can talk, think, or feel our way out of failing to meet expectations.  My own predilection is that of a thinker, which is probably why I became a psychodynamic psychotherapist and educator.  I often waste a lot of time trying to think (or argue) myself into a new reality, which just boils down to not accepting the reality principle.  I notice the same with patients, colleagues and students, who miss deadlines, avoid work, come late to class and then try their best to think or feel their way out of it.

The first class each semester I tell my students, who are studying to be social workers and psychotherapists, that the most frequent complaint I get as an instructor is “I feel put on the spot by him.”  I assure them that this is a valid feeling and actually reflects the reality that I will put each and every one of them on the spot.  I will ask them tough questions, I will point out that they are coming late to class, I will disagree with ideas that seem erroneous to me.  Because if they think it is ok to be late or avoid thinking through a problem or confrontation in class, how in the world will they ever be a decent psychotherapist or social worker?  If the single mother you are working with wants to know how to apply for WIC, and you say you feel put on the spot by her question, that is a valid feeling AND you are useless to her.  If your therapist was 15 minutes late every week I hope you’d fire him.  And when you are conducting a family session and someone discloses abuse it is unprofessional to say “I’m feeling overwhelmed and sad right now, can you ask somebody else to go next?”

This sort of disconnect doesn’t happen overnight.  It comes from years of being enabled by well-intentioned parents and yes, mental health providers, who focus on feelings to the exclusion of cognition and behavior, and worse, try to ensure that the children in their care grow to adulthood feeling a constant sense of success.  When I hear self psychology-oriented folks talk it is almost always about mirroring and idealizing, and never about optimal frustration.  And I suspect that this is because we have become so focused on feelings and success that we are preventing people from experiencing optimal frustration at all.

The novelist John Hersey said, “Learning starts with failure; the first failure is the beginning of education.”  We commence to learn because reality has shown us that we lack knowledge or understanding.  That’s the good news.  We’ve woken up!  In this light I regard video games as one of the most consistent learning tools available to us.  When that fail happens and that screen goes up you can try to persuade it to cut you some slack, flatter or bully it, weep pleadingly for it to change to a win, but no matter how you feel, you still failed.  And because that reality is so starkly there, and because the Xbox or PS3/PS4 doesn’t get engaged in your drama, that feeling will eventually dissipate and you will either try again, or give up.

Because that is in a lot of ways the conflict we’re trying to avoid, isn’t it?  We want to avoid looking reality square in the face and taking responsibility for what comes next.  We want to keep the feelings flowing, the drama going, and we are willing to take entire groups of people and systems with us.  If we are lucky they put their feet down, but more often than not they want to avoid conflict too, and the problem just continues.

So here’s a confession:  I have failed at things.  I have ended a task without success.  I have not done things I was expected to do.  I have fallen short, been inadequate and been unsuccessful at stuff.  And nobody took away my birthday.  I’m still around doing other things, often iterations of the previous failures, quite successfully.

If you are a parent or educator please take a lesson from video games.  Start saying “Game Over” to those in your care sometimes.  If they can try again, great.  If they want to read up on some strategy guides or videos to learn how to do it better, awesome.  But please stop capitulating to their desire to escape reality on the illusory lifeboats of emotional expression, rationalization or verbal arguments.  As Mrs. Smeal says in “Benny and Joon,” “when a boat runs ashore, the sea has spoken.”  Reality testing is probably the most important ego function you can help someone develop; please don’t avoid opportunities to do so.

Nobody likes to experience failure; I know it feels awful.  But to move through it to new realizations can be very liberating, and in time failure becomes more easily bearable.  And I truly believe that success without past failures feels pretty hollow.  When I play through a video game from start to finish without a fail I don’t feel like a winner.  I feel cheated.

 


Selfie Esteem

[Selfie: Nancy J. Smyth, PhD, Dean & Professor, University at Buffalo]

 

“Photographs do not explain, they acknowledge.” –Susan Sontag

Last month, the Oxford Dictionary made the word “selfie” not only an official word, but their word of the year for 2013.  Defining selfie as “a photograph that one has taken of oneself, typically one taken with a smartphone or webcam and uploaded to a social media website,” the OD made explicit what has implicitly grown to be the norm of our world: a world of smartphones, self pics and social media.

Many psychotherapists and social workers have decried and will continue to decry this as another sign of the “narcissism” of our age.  Selfies have become synonymous with the millennials, the dumbing down of the populace by the internet, and sometimes even stretching to how Google is making us stupid.  My chosen profession has historically played fast and loose with calling people and cultures narcissistic.  Karen Horney coined the term “the neurotic personality of our time” in the 1930s, initially in part as a critique of the Freudian critique of Victorian modesty.  Kohut’s groundbreaking work on “tragic man,” and the healthy strands of narcissism in human life, was co-opted within years by Lasch (1979) to describe the then-current “culture of narcissism.”  In short, even though narcissism has been a part of being human at least since Narcissus gazed into the water in Greco-Roman times, we continue to see it as perennially on the rise.

 

[Selfie: Joanna Pappas, Epic MSW Student]

 

This dovetails with each generation’s lament that the subsequent one has become more self-absorbed.  And yet, as Sontag points out, by making photography everyday, “everybody is a celebrity.”  Yep, that’s what we hate about the millennials, right?  They think everything is an accomplishment, their every act destined for greatness.  But as Sontag goes on to say, making everybody a celebrity is also making another interesting affirmation: “no person is more interesting than any other person.”

 

[Selfie: Jonathan Singer, Assistant Professor, Temple University]

 

Why do many of us (therapists in particular) have a problem then with selfies?  Why do we see them as a “symptom” of the narcissism of the age?  Our job is to find the interesting in anyone, after all. We understand boredom as a countertransference response in many cases, our attempt to defend against some projection of the patient’s.  So why the hating on selfies?

I think Lewis Aron hits on the answer, or at least part of it, in his paper “The Patient’s Experience of the Analyst’s Subjectivity.”  In it he states the following:

 

I believe that people who are drawn to analysis as a profession have particularly strong conflicts regarding their desire to be known by another; that is, they have conflicts concerning intimacy.  In more traditional terms, these are narcissistic conflicts over voyeurism and exhibitionism.  Why else would anyone choose a profession in which one spends one’s life listening and looking into the lives of others while one remains relatively silent and hidden?

(Aron, A Meeting of Minds, 1996, p. 88)

 

In other words, I believe that many of my colleagues have such disdain for selfies because they secretly yearn to take and post them.  If you shuddered with revulsion just now, check yourself.  I certainly resemble that remark at times:  I struggled long with whether to post my own selfie here.  What might my analytically-minded colleagues think?  My patients, students, supervisees?  I concluded that the answers will vary, but in general the truth that I’m a human being is already out there.

 

[Selfie: Mike Langlois, PvZ Aficionado]

 

Therapists like to give themselves airs, including an air of privacy in many instances.  We get hung up on issues of self-disclosure, when what the patient is often really looking for is a revelation that we have a subjectivity rather than disclosure of personal facts.  And as Aron points out, our patients often pick up on our feelings of resistance or discomfort, and toe the line.  One big problem with this, though, is that we don’t know what they aren’t telling us, because they didn’t tell us.  In the 60s and 70s there were very few LGBT issues voiced in therapy, and the naive conclusion was that this was because LGBT people and experiences were a minority, in society in general and in one’s practice in particular.  Of course, nobody was asking patients if they were LGBT, and by not asking, therapists were communicating their discomfort.

What has this got to do with selfies?  Well for one thing, I think that therapists are often similarly dismissive of technology, and convey this by not asking about it in general.  Over and over I hear the same thing when I present on video games–“none of my patients talk about them.”  When I suggest that they begin asking about them, many therapists have come back to me describing something akin to a dam bursting in the conversation of therapy.  But since we can’t prove a null hypothesis, let me offer another approach to selfies.

No photograph, selfie or otherwise, explains anything.  For example:

 

[Image: “looting”]

 

People who take a selfie are not explaining themselves; they are acknowledging that they are worth being visible.  Unless you have never experienced any form of oppression this should be self-evident, but in case you grew up absolutely mirrored by a world that thought you were the right size, shape, color, gender, orientation and class, I’ll explain:  Many of our patients have at least a sneaking suspicion that they are not people.  They look around the world and see others with the power and prestige and compare that to the sense of emptiness and invisibility they feel.  Other people can go to parties, get married, work in the sciences, have children, buy houses, etc.  But they don’t see people like themselves prevailing in these areas.  As far as they knew, they were the only biracial kid in elementary school, adoptee in middle school, bisexual in high school, trans person in college, rape survivor at their workplace.

So if they feel that they’re worth a selfie, I join with them in celebrating themselves.

As their therapist I’d even have some questions:

  • What were you thinking and feeling that day you took this?
  • What do you hope this says about you?
  • What do you hope this hides about you?
  • Who have you shared this with?
  • What was their response?
  • What might this selfie tell us about who you are?
  • What might this selfie tell us about who you wish to be?
  • Where does that spark of belief that you are worth seeing reside?

In addition to exploring, patients may find it a useful intervention to keep links to certain selfies which evoke certain self-concepts and affect states.  That way, if they need a shift in perspective or affect regulation they can immediately access a powerful visual reminder which says “This is possible for you.”

Human beings choose to represent themselves in a variety of ways, consciously and unconsciously.  They can be whimsical, professional, casual, friendly, provocative, erotic, aggressive, acerbic, delightful.  Are they projections of our idealized self?  Absolutely.  Are they revelatory of our actual self?  Probably.  They explain nothing, acknowledge the person who takes them, and celebrate a great deal.  If there is a way you can communicate a willingness to see your patients’ selfies you might be surprised at what opens up in the therapy for you both.

 

[Selfie: Melanie Sage, Assistant Professor, University of North Dakota]

 

In other posts I have written about Huizinga’s concept of play.  Rather than seeing selfies as the latest sign that we are going to hell in a narcissistic handbasket, what if we looked at the selfie as a form of play?  Selfies invite us into the play element in the other’s life; they are not “real” life but free and unbounded.  They allow each of us to transcend the ordinary for a moment in time, to celebrate the self, and share with a larger community as a form of infinite game.

It may be beyond any of us to live up to the ideal that no one is less interesting than anyone else in our everyday lives, but seen in this light the selfie is a renunciation of the cynicism I sometimes see in the mental health professionals I meet.  We sometimes seem to privilege despair as somehow more meaningful and true than joy and celebration, but aren’t both essential parts of the human condition?  So if you are a psychotherapist or psychoeducator, heed my words:  The Depth Police aren’t going to come and take your license away, so go out and snap a selfie while everyone is looking.


The Internet Is Not A Meritocracy, That’s Why You Hate It


Recently, I had a discussion with a student about social media, and the fact that I usually start off a comment on a blog with “great post!”  She noted two things:  First, that it rang false to her initially, making her wonder if I even read the posts people write; and second, that despite this initial impression she found herself commenting anyway.  So let me define what a great post is.

A great post is one that captures your interest and keeps the thoughtful discourse going.

Now many of my academic readers are going to vehemently disagree.  They may disagree with this blog post entirely, and you know what?  If they comment on it, I’ll publish the comment.  Because the comment keeps the discourse going.

Also recently, I was explaining my pedagogy to colleagues who were questioning my choice to assign a whole-class group assignment for 25% of the student grade.  The concern was that by giving the class a grade as a whole I would run the risk of grade inflation.  This is a real concern for many of my peers in academia and I respect that, and as someone who believes in collaboration I intend to balance advocating for my pedagogical view with integrating the group’s discerning comments and suggestions.  In my blog, however, let me share my unbridled opinion on this.

I don’t care about grade inflation.

Really, I don’t.  I went to a graduate school which didn’t have grades, but had plenty of intellectual rigor.  I am more concerned with everyone having a chance to think and discuss than ranking everyone in order.  That is my bias, and that is one reason I like the internet so much.

The old model of education is a meritocracy, which according to OED is:

Government or the holding of power by people chosen on the basis of merit (as opposed to wealth, social class, etc.); a society governed by such people or in which such people hold power; a ruling, powerful, or influential class of educated or able people.

 

I think that Education 2.0 has many of us rethinking this.  Many of our students were indoctrinated into a view of education that is decidedly meritocratic.  I suspect this was part of what was behind my student’s skepticism about “great post!”  My role as an educator in a meritocracy is to evaluate the merit of these comments and ideas, rank them and award high praise only to those which truly deserve it.  By calling everything a great post, I demean student endeavors.

One of my colleagues, Katie McKinnis-Dietrich, frequently talks about “finding the A in the student.”  This interests me more than the finite game of grading.  Don’t get me wrong, I do offer students choices about how to earn the highest marks in our work together, and I do require things of them; but I try hard to focus more on the content and discourse than on grades.

I frequently hear from internet curmudgeons that the internet is dumbing down the conversation.  The internet isn’t dumbing down the conversation:  The internet is widening it.  Just as post-Gutenberg society allowed literacy to become part of the general population, Web 2.0 has allowed more and more human beings to have access to the marketplace of ideas.  We are at an historic point in that marketplace, where more intellectual wares are being bought and sold than ever before.  More discernment is certainly required, but the democratization of the internet has also revealed the internalized academic privilege we often take for granted.  Every ivory tower now has WiFi, and so we see more incidents of sneering at someone’s grammar and picking apart their spelling.  What is revealed is not just the poor grammar and spelling of the other, but our own meritocratic tendencies.

Detractors will pointedly ask me if I would undergo surgery performed by someone who had never been to medical school, and I will readily admit that I would not.  But how can we reconcile that with the story of Jack Andraka, a 15-year-old who, with no formal training in medicine, created a test for pancreatic cancer that is “100 Times More Sensitive & 26,000 Times Cheaper” than current tests?  In fact, if you listen to his TED talk, Jack implicitly tells the story of how only one of the many universities he contacted took him seriously enough to help him take this discovery to the next level.  Meritocracy in this case slowed down the process of early intervention with pancreatic cancer.  One side of this story is that this test will save countless lives; the darker side is how many lives were lost because the meritocracy refused to believe that someone who hadn’t been educated in the Scholastic tradition could have a really good idea.

I am urgently concerned with moving education further in the direction of democracy and innovation.  Any post that gets me thinking and interacting thoughtfully with others is a great post.  On a good day I remember this.

But like many academics and therapists and educators and human beings brought up in a meritocracy, I have my bad days.  Like many of you, I fear becoming irrelevant.  I resist change, whether it be the latest iOS or social mores.  Last night I caught myself reprimanding (internally) the guy wearing a baseball cap to dinner in the restaurant I was in.

We still live in a world where only students with “special needs” have individualized education plans– quite frankly, I think that everyone should have an individualized education plan.  I think our days of A’s being important are numbered.  There are too many “A students” unemployed or underemployed, too many untenured professors per slot to give the same level of privilege in our educational meritocracy.  Digital literacy is the new frontier, and I hope our goal is going to be maximizing the human potential of everyone for everyone’s sake.  Yes this is a populist vision, I think the educational “shining city on the hill” needs to be a TARDIS, with room for the inclusion of all.  I also think that those of us who have benefited from scholastic privilege will not give this privilege up easily.  We desperately want to remain relevant.

I know it is risky business putting this out in the world where my colleagues could see it.  I know this will diminish my academic standing in the eyes of many.  I know my students may read it and co-opt my argument to try to persuade me to give the highest grade.  But if I believe in discourse and collaboration I’ll have to endure that and walk the walk.

I’m not saying that every idea is a good one.  What I am saying, and what I believe has changed my life for the better, is something I find humbling and amazing about the human being:  Not every idea is a good one, but anyone, anyone at all, can have a good idea.


Innovation is Dangerous & Gaming Causes Asperger’s

GamerTherapist blog is on vacation and will return with new posts after Labor Day.  In the meantime, here is a reader favorite:

At its heart, diagnosis is about exerting control.  Clinicians want to get some sense of control in understanding a problem.  We link diagnosis to prognosis to control our expectations of how likely and how much we will see a change in the patient’s condition.  Insurance companies want to get a handle on how much to spend on who.  Schools want to control access to resources and organize their student body.  And with the current healthcare situation, the government is sure to use diagnosis as a major part of the criteria in determining who gets what kind of care.

We therapists and educators do not like to think of ourselves as controlling people.  But we often inadvertently attempt to exert control over our patients and entire segments of the population, by defining something as a problem and then locating it squarely in the individual we are “helping.”

This week has been one of those weeks where I have heard from several different colleagues about workshops they are attending where the presenters are linking Asperger’s with Gaming Addiction:  Not in the sense of “Many people on the Autism Spectrum find success and motivation through the use of video games,” but rather in the sense of “excessive gaming is prevalent in the autistic spectrum community.”

This has always frustrated me, for several reasons, and I decided it’s time to elaborate on them again:

1. Correlation does not imply Causation.  Although this is basic statistics 101 stuff, therapists and educators continue to make this mistake over and over.  Lots of people with Asperger’s play video games; this is true.  This should not surprise us, because lots of people play video games!  97% of all adolescent boys and 94% of adolescent girls, according to the Pew Research Center.  But we love to make connections, and we love the idea that we are “in the know.”  I can’t tell you how many times when I worked in education and clinics I heard talk of people who were “suspected” of having Asperger’s because they liked computers and did not make eye contact.  Really.  If a kiddo didn’t look at the teacher, and liked to spend time on the computer, a suggested diagnosis of Autism couldn’t be far behind.  We like to see patterns in life, even oversimplified ones.

2. Causation often DOES imply bias.  Have you ever stopped to wonder what causes “neurotypical” behavior?  Or what causes heterosexuality, for that matter?  Probably not.  We usually only look for the causation of things we are busily pathologizing in people.  We want everyone to fit within the realm of what the unspoken majority has determined is normal.  Our education system is still designed like a little factory.  We want to have our desks in rows, our seats assigned, and our tests standardized.  So if your sensory input is a little different, or your neurology atypical, you get “helped.”  Your behavior is labeled as inappropriate if it diverges, and you are taught that you do not have social skills and need to learn them.

Educators, parents, therapists and partners of folks on the Autism Spectrum, please repeat this mantra 3 times:

It is not good social skills to tell someone they do not have good social skills.

By the same token, technology, and video games, are not bad or abnormal either.  Don’t you see that it is this consensual attitude that there is something “off” about kids with differences or gamers or geeks that silently telegraphs to school bullies that certain kids are targets?  Yet, when an adolescent has no friends and is bullied it is often considered understandable because they have “poor social skills and spend too much time on the computer.”  Of course, many of the same kids are successfully socializing online through these games, and are active members of guilds where the stuff they hear daily in school is not tolerated on guild chat.

Let’s do a little experiment:  How about I forbid you to go to your book discussion group, poker night, or psychoanalytic institute.  Instead, you need to spend all of your time with the people at work who annoy you, gossip about you and make your life miserable.  Sorry, but it is for your own good.  You need to learn to get along with them, because they are a part of your real life.  You can’t hide in rooms with other weirdos who like talking about things that never happened or happened a long time ago; or hide in rooms with other people that like to spend hours holding little colored pieces of cardboard, sort them, and exchange them with each other for money; or hide in rooms where people interpret dreams and talk about “the family romance.”

I’m sure you get my point.  We have forgotten how little personal power human beings have before they turn 18.  So even if playing video games was a sign of Asperger’s, we need to reconsider our idea that there is something “wrong” with neuro-atypical behaviors.  There isn’t.

A lot of the work I have done with adults on the spectrum has been to help them debrief the trauma of the first 20 years of their lives.  I’ve had several conversations where we’ve realized that they are afraid to ask me or anyone questions about how to do things, because they worried that asking the question was inappropriate or showed poor social skills.  Is that really what you want our children to learn in school and in treatment?  That it is not ok to ask questions?  What a recipe for a life of loneliness and fear!

If you aren’t convinced, please check out this list of famous people with ASD.  They include actors (Daryl Hannah), bankers, composers, rock stars, a royal prince and the creator of Pokemon.  Not really surprising when you think about innovation.

3.  Innovation is Dangerous.  Innovation, like art, requires you to want things to be different than the way they are.  Those are the kids that don’t like to do math “that way,” or are seen as weird.  These are the “oversensitive” ones.  These are the ones who spend a lot of time in fantasy, imagining a world that is different.  These are the people I want to have over for hot chocolate and talk to, frankly.

But in our world, innovation is dangerous.  There are unspoken social contracts that support normalcy and bureaucracy (have you been following Congress lately?)  And there are hundreds of our colleagues who are “experts” in trying to get us all marching in lockstep, even if that means killing a different drummer.  When people try to innovate, they are mocked, fired from their jobs, beaten up, put down and ignored.  It takes a great deal of courage to innovate.  The status quo is not neutral; it actively tries to grind down those who are different.

People who are fans of technology, which nowadays means the internet and computing, have always been suspect, and treated as different or out of touch with reality.  They spend “too much time on the computer,” we think, until they discover the next cool thing, or crack a code that will help fight HIV.  Only after society sees the value of what they have done do they get any slack.

Stop counting the hours your kid is playing video games and start asking them what they are playing and what they like about it.  Stop focusing exclusively on the “poor social skills” of the vulnerable kids and start paying attention to bullies, whether they be playground bullies or experts.  Stop worrying about what causes autism and start worrying about how to make the world a better place for people with it.


Dopey About Dopamine: Video Games, Drugs, & Addiction

Last week I was speaking to a colleague whose partner is a gamer. She was telling me about their visit to his mother. During the visit my colleague was speaking to his mother about how much he still enjoys playing video games. His mother expressed how concerned she had been about his playing when he was young. “It could have been worse though,” she’d said, “at least he wasn’t into drugs.”

This comparison is reminiscent of the homophobic one where the tolerant person says, “I don’t mind if you’re gay, as long as you don’t come home with a goat.” The “distinction” made actually implies that the two things are comparable. But in fact they are not.

Our culture uses the word addiction pretty frequently and casually. And gamers and opponents of gaming alike use it in reference to playing video games. Frequently we hear the comments “gaming is like a drug,” or “video games are addictive,” or “I’m addicted to Halo 3.” What muddies the waters further are the dozens of articles that talk about “proof” that video games are addictive, that they cause real changes in the brain, changes just like drugs.

We live in a positivistic age, where something is “real” if it can be shown to be biological in nature. I could argue that biology is only one way of looking at the world, but for a change I thought I’d encourage us to take a look at the idea of gaming as addictive from the point of view of biology, specifically dopamine levels in the brain.

Dopamine levels are associated with the reward center of the brain, and the heightened sense of pleasure that characterizes rewarding experiences. When we experience something pleasurable, our dopamine levels increase. It’s nature’s way of reinforcing behaviors that are often necessary for survival.

One of the frequent pieces of evidence used to support video game addiction is studies like the one by Koepp et al. from 1998, which monitored changes in dopamine levels in subjects who were playing a video game.  The study noted that dopamine levels increased during game play “at least twofold.”  Since then, literature reviews and articles with an anti-gaming bias frequently and rightly state that video games can cause dopamine levels to “double” or significantly increase.

They’re absolutely right: video games have been shown to increase dopamine levels by 100% (aka doubling).

Just like studies have shown that food and sex increase dopamine levels:

This graph shows that eating food often doubles the level of dopamine in the brain, ranging from a spike of 50% to a spike of 100% an hour after eating. Sex is even more noticeable, in that it increases dopamine levels in the brain by 200%.

So, yes, playing video games increases dopamine levels in your brain, just like eating and having sex do, albeit less. But just because something changes your dopamine levels doesn’t mean it is addictive. In fact, we’d be in big trouble if we never had increases in our dopamine levels. Why eat or reproduce when it is just as pleasurable to lie on the rock and bask in the sun?

But here’s the other thing that gets lost in the spin. Not all dopamine level increases are created equal. Let’s take a look at another chart, from the Meth Inside-Out Public Media Service Kit:

This is a case where a picture is worth a thousand words.  When we read that something “doubles” it certainly sounds intense, or severe.  But an increase of 100% seems rather paltry compared to 350% (cocaine) or 1200% (meth)!

One last chart for you, again from the NIDA. This one shows the dopamine increases (the pink line) in amphetamine, cocaine, nicotine and morphine:

Of all of these, the drug morphine comes closest to a relatively “low” increase of 100%.

So my point here is twofold:

1. Lots of things, not all or most of them drugs, increase the levels of dopamine.

2. Drugs cause a much more marked, sudden, and intense increase in dopamine levels than video games do.
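Since “percent increase” and “times baseline” are easy to conflate, here is a quick back-of-the-envelope conversion, sketched in Python with only the rough figures quoted above:

    # Rough figures quoted above (Koepp et al. for gaming; the charts for the rest).
    percent_increases = {
        "eating food": 50,        # low end of the food spike
        "video games": 100,       # "at least twofold"
        "sex": 200,
        "cocaine": 350,
        "methamphetamine": 1200,
    }

    for activity, pct in percent_increases.items():
        multiplier = 1 + pct / 100   # a +100% increase means 2x baseline, not 100x
        print(f"{activity}: +{pct}% -> {multiplier:g}x baseline")

    # eating food: +50% -> 1.5x baseline
    # video games: +100% -> 2x baseline
    # sex: +200% -> 3x baseline
    # cocaine: +350% -> 4.5x baseline
    # methamphetamine: +1200% -> 13x baseline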

Does this mean that people can’t have problem usage of video games?  No.  But what it does mean, in my opinion, is that we have to stop treating behaviors as if they were controlled substances.  Playing video games, watching television, eating, and having sex are behaviors that can all be problematic at certain times and in certain contexts.  But they are not the same as ingesting drugs; they don’t cause the same level of chemical change in the brain.

And we need to acknowledge that there is a confusion of tongues where the word addiction is involved.  Using it in a clinical sense is different than using it in a lay sense: saying “I’m hooked on meth” is not the same as saying “I’m hooked on phonics.”  Therapists and gamers alike need to be more mindful of what they are saying and meaning when they say they are addicted to video games.  Do they mean it is a psychological illness, a medical phenomenon?  Do they mean they can’t get enough of them, or that they like them a whole lot?  Do they mean it is a problem in their life, or are they parroting what someone else has said to them?

I don’t want to oversimplify addiction by reducing it to dopamine level increase. Even in the above discussion I have oversimplified these pieces of “data.” There are several factors, such as time after drug, that we didn’t compare. And there are several other changes in brain chemistry that contribute to rewarding behavior and where it goes awry. I just want to show an example of how research can be cited and misused to distort things. The study we started out with simply found that we can measure changes in brain chemistry which occur when we do certain activities. It was not designed or intended to be proof that video games are dangerous or addictive.

Saying that something changes your brain chemistry shouldn’t become the new morality.  Lots of things change your brain chemistry.  But as Loretta LaRoche says, “a wet towel on the bed is not the same as a mugging.”  We need to keep it complicated and not throw words around like “addiction” and “drug” because we want people to take us seriously or agree with us.  That isn’t scientific inquiry.  That’s hysteria.


Breaking Eggs and Holding Your Fire: Some Thoughts on Skills Acquisition


Not too long ago, I was learning how to fire a sniper rifle in Call of Duty.  It wasn’t going very well.  I kept firing (which you do by holding down the right-hand trigger) and missing.  Or I would use the scope, which you do by holding down the left-hand trigger, and then try to find my target so slowly that I’d get shot long before seeing it.  To make things more complicated, my patient Gordon** was trying to teach me the difference between “hardscoping,” which meant to press and hold down the left trigger, and “quickscoping,” which was more like a quick tap and release of the scope.

The key to success, I was told, was to locate the target, quickscope it for a second to take aim, and then fire.  The source of my failure was that I’d see the target, not bother to scope at all, and just fire.  At first I didn’t even know I was doing that.  I thought the scope was going up, and it was, but it was going up a split second after I was firing, not before.  After several fumbled attempts Gordon said, “you have to not fire and learn to push the scope first instead.”  I suddenly realized that he was teaching me about impulse control.
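For the non-gamers, the sequencing Gordon was drilling into me can be spelled out as a toy sketch; this is my own illustration in Python, not anything from the game’s actual code:

    def is_quickscoped(events, max_scope_hold=1.0):
        """True if the scope came up before the shot, and only briefly."""
        scope_time = None
        for t, action in events:
            if action == "scope":
                scope_time = t
            elif action == "fire":
                # Impulse control, operationalized: don't fire until the scope
                # is up, and don't hold it ("hardscope") for too long either.
                return scope_time is not None and (t - scope_time) <= max_scope_hold
        return False

    panic_shot = [(0.00, "fire"), (0.12, "scope")]  # my early attempts: the scope arrived late
    quickscope = [(0.00, "scope"), (0.35, "fire")]  # what Gordon taught: scope first, then fire

    print(is_quickscoped(panic_shot))  # False
    print(is_quickscoped(quickscope))  # True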

Because many parents and therapists are reluctant to play video games, in particular first-person-shooters, they only tend to see them from outside the experience.  What they learn from seeing that way is that FPS are full of violence, mayhem, blood and noise.  Is it any wonder then that they grow concerned about aggression and the graphic nature of the game?  It’s all that is really available to them unless there is a strong plot line and they stick around for that.

But as someone who has been playing video games for years I can tell you things are different from within the experience.  And one of the most counterintuitive things I can tell you from my experience is this: First Person Shooters can help you learn impulse control.  It takes a lot more impulse control to not fire at a target the second you see it.  It takes a lot more impulse control to wait and scope.  And because all of these microdecisions and actions take place within the player’s mind and the game experience, outside observers see violence and aggression alone and overlook the small acts of impulse control the player has to exert over and over again.

Any therapist who has worked with adolescents, people with ADHD, personality disorders and a host of other patient types understands the importance of learning impulse control. That act of mindfulness, that ability to create a moment’s space between the situation and the patient’s reaction to it is necessary to help people do everything from their homework to suicide prevention.  In addition, there is always a body-based aspect to impulse control, however brief or small, and so to create that space is to forge a new and wider relationship between mind and body.

All of this was going on as we were playing Xbox.  Over and over again, I was developing and practicing impulse control from behind that virtual sniper rifle.  Again and again I was trying to recalibrate my bodily reflexes and sensations to a new mental model.  Don’t fire.  When my kill score began to rise, it wasn’t because my aim had gotten better, it was because my impulse control had.

Meanwhile, for the past two weeks I have been practicing making omelettes.

In particular, I have been learning how to make an omelette roulée of the kind Julia Child makes below (you can skip to 3:30 if you want to go right to the pan).

This type of omelette requires the ability to quickly (in 20-30 seconds) tilt and jerk the pan towards you multiple times, and then tilt the pan even more to flip it.  Done over the highest heat, the movement needs to be quick and reflexive or you end up tossing a scrambled eggy mess onto the burner.  I can’t tell you how tense that moment is when the butter is ready and you know that once you pour in the egg mixture there is no going back.  To jerk the pan sharply towards you at a tilt seems so counterintuitive, and this is an act of dexterity, meaning that your body is very involved.

In a way an omelette roulée requires impulse control just like Call of Duty: you have to learn not to push the pan but to pull it toward you first.  But just as importantly, making this omelette requires the ability to take risks.  It can be scary to make a mess; what happens if the eggs fly into the gas flame?!

Let me tell you, because I now know what happens:  You turn off the flame, wait a minute and wipe off the messy burner.  And then you try again.

Adolescents, all people really, need to master both of these skills of impulse control and risk-taking.  To do so means widening the space in your mind between situation and action, but not letting that space become a gaping chasm impossible to cross.  Learning impulse control also happens within experience, not in a special pocket universe somewhere apart from it.  Learning risk-taking requires the same.  And at their core they are bodily experiences, which may be what Freud meant when he said that the ego was first and foremost a body ego.

When I worked in special education settings, I was often called on to restrain children in crisis.  Afterwards we would usually do a postvention: “What was happening?” “How could you do things differently next time?”  We were looking at their experience from the outside, constructing a little pocket universe with words, as if we understood what had been going on in the experience, in the body and psyche of the child.  I doubt these post-mortems taught impulse control.

I wonder what might have happened if we had risked throwing some eggs on the fire and encouraged the kids to play first person shooters or other video games.  If my theory is right, then we would have been cooking.

**Not his real name. Name, age, gender and other identifying information have been altered to preserve confidentiality.

Mike is on vacation until September, which means that he has started talking in the third person at the end of blog posts.  It also means that the next new post will be next month.  He’ll repost an old fave or book excerpt to tide you over in the meantime.

 


“Can I Kill You Again Today?”: The Psychoanalysis of Player Modes


In 1947, Virginia Axline published the first edition of what was to become a seminal work in the field it was named for, Play Therapy.  In her book she championed the concept of non-directive play, the form of play therapy where the therapist takes, in some ways, a very Rogerian approach of reflecting rather than directing the play, either overtly or subtly.

This is easier said than done, as I learned when I started using it as an intern.  I recall watching a youngster play and describe a family in a horrible car accident.  My first comment was, “are they all right?”, covertly signalling to the child that I was anxious in the presence of such violence and the possibility of death.  The child reassured me that the family was okay, and I am convinced that I had essentially ruined that session’s treatment.  Fortunately I had an amazing supervisor, Linda Storey (great name for a therapist too!), who helped me learn how to truly be non-directive.  Over the next year and ever since, I have greeted tornadoes, murder, floods, monster attacks, plane crashes, burning buildings and other disasters with “what happens next?”

Non-directive play therapy is still at its heart a two-part invention between the therapist and the patient.  However, unlike some other forms of treatment, it requires the therapist to be able to tolerate a lot of violence and anxiety.  Trying to direct children away from their aggressive fantasies and desires is often rooted in the therapist’s own anxiety about them.  Let’s face it, for many of us death and destruction are scary things.  It isn’t just a rookie mistake to ask the child to make the story turn out “okay,” and yet I think it has never been more urgent for therapists to be able to tolerate violent fantasy and encourage it to unfold in the play.

21st Century Play

Virginia Axline never had to contend with Call of Duty Special Ops, Modern Warfare or Battlefield 3.  What was different about 20th Century play therapy was that the games in the consulting room usually resembled the ones from the child’s everyday life at home or school.  The therapists therefore knew how to play them, and didn’t necessarily need to learn them as they went.  But now we are in the 21st century, where the therapy office often has games from our childhoods rather than those of our patients, and they are very different.

If you are a therapist and never intend to learn to play video games and play them with your patients, you should probably stop reading here; the post won’t be useful to you and I’ll probably annoy you.  But if you don’t plan on using video games with your young patients, I hope you’ll consider stopping play therapy with children as well.  Certainly stop calling yourself a non-directive play therapist, because you’ve already directed the child’s play away from their familiar games and away from this century.  I actually hope, though, that you will lean into the places that scare you and try to meet your patients where they are in their play, and for 97% of boys and 94% of girls that means video games.

Video games like Call of Duty and Minecraft are very useful in both diagnosis and treatment, as I hope to demonstrate by focusing on just one aspect here, that of player modes.  Most video games have a range of player modes, and what the patient chooses can say a lot about their attachment styles, selfobject needs, and object relations.

Solo Play is OK

Like other forms of play, sometimes patients want to play alone, and have me bear witness to their exploits.  They may do so out of initial mistrust, or a yearning for mirroring.  Solo play is looked down on by some therapists, who often think kids using “the computer” are autistic and/or “stuck” in parallel play.  I’d refer you to Winnicott, who taught us that it is a developmental achievement to be alone in the presence of another.  (I’d also refer you to my colleague and therapist Brian R. King, who has a lot to say about a strengths-based approach to people on the autistic spectrum, on which he includes himself.)

The Many Reasons to Collaborate.

Some patients want to play with me on the same team in first person shooter games.  The reasons for this can vary.  Some patients want to protect me from their aggression because they are afraid I’ll be scared of it like parents, teachers and other adults may have been.  Other patients want to be on the same team because they want  to have a merger with an idealized parent imago to feel more powerful and able to take on the game.  Still other patients, seen in their daily lives as oppositional or violent, want to play on the same team so they can revive me and have me experience them as nurturing and a force for good in the world.

Some patients want to have their competition framed by overall collaboration, meaning that they want to get the most or the final “kills” while remaining on the same team. Some secretly yearn to play on a different team, and may need to “accidentally” change the settings to put us on opposing sides and then passively let the game continue.

Let’s Bring On A World of Hurt

On the other hand, there are a lot of reasons patients want to compete. They may want to see if I can withstand their aggression and/or desire to win without being annihilated. They may want to express their sadism by tormenting me for my lack of skill, or alternatively project their own yearnings for recognition by praising me when I kill them. They may want to see how I manage my frustration when playing, and interpret that frustration as investment in the game and therefore in my relationship with them. They may be watching very carefully to see how I act when I win or lose. Do I gloat when I win? Do I make excuses when I lose? How might these behaviors be understood by children and adolescents who often feel like they are chronically losing, and behind their peers, in the game of education?

More questions arise: Does the patient ask me what mode I want to play, or simply decide on one? Do they modulate their anxiety by playing a combat mode while expressing the desire to stay away from the zombie mode? By allowing them to do that, am I helping them learn that sometimes life is about choosing the lesser of two anxieties rather than avoiding anxiety altogether?

Multiplayer and Uninvited Guests

In terms of settings, there is some direction on my part, which is part of maintaining the therapeutic frame. I make it a requirement that we play either locally or in a private game. And of course this sometimes goes wrong, with a random player joining us.

What to do then? What if we are at an extremely high level, and simply terminating the game will do more harm than good? In that case I make sure we are on mute so our conversation can’t be heard by the added player, and then things get even more interesting in the therapeutic conversation: Does the patient have any feelings about the new player’s arrival? What do they imagine the gamertag “NavySeal69” means, anyway? Do we help them when they are down, or try to ignore them? How do we feel if they are ignoring us? Do we team up against them?

Minecraft and the Repetition Compulsion

I could probably write a whole post or paper on this, but for now let’s talk about creative mode and griefing. In Minecraft you and other players can build things alone or together. Other players can also “grief” you, meaning cause you grief by destroying your structures and setting you back after a lot of hard work. What does it mean when a patient griefs my building, apologizes and promises not to grief it again if I rebuild, and then griefs it over and over? What might be reenacted here? Are there adults in the patient’s life who tear her/him down again and again? When does one give up on any hope of honesty or compassion from the other? What sort of object are they inviting me to become to them: angry, patient, gullible, limit-setting, mistrustful?

I have used the terms child and adolescent here, but exploring the gameplay of adults when they describe it to me is often useful as well. I often encourage my adult students and gamer readers to do a little self-analysis on their play style: What does your preferred mode of moving through video games say about you? What questions does it invite you to explore?

The goal here is not to give you an explicit case presentation or analysis of one hypothetical patient or game. Rather, it is to provide you with a Whitman’s Sampler of practice and theory nuggets, to give you a taste of the richness you are missing if you don’t play video games with your patients, especially if you are a psychodynamic therapist. There is a lot that “happens next” when you engage your patients in 21st-century play, and the themes may be familiar: How do I live in a world that can be hostile to me? Why should I trust you to be any different? Will my badness destroy or repulse you? Will you hurt me if I am vulnerable? These and dozens of other fascinating and relevant themes emerge in a way they never did for me when I forced kids to endure 45 minutes of the Talking, Feeling, and Doing Game. And what’s more, you don’t have to remember to take the “What Do You Think About a Girl Who Sometimes Plays with or Rubs Her Vagina When She’s Alone?” card out of the deck.

I’m not THAT non-directive.  🙂

 

Like this post? I can speak in person too, check out the Press Kit for Public Speaking info. And, for only $4.99 you can buy my book. You can also Subscribe to the Epic Newsletter!

Saving Ideas

cave painting

Sometime over 40,000 years ago, someone decided to put images of human hands on the cave wall pictured above. It turned out to be a good idea. This painting has given scientists information about life in the Upper Paleolithic, raised questions about the capacity of Neanderthals to create art, and sparked debate about which species in the genus Homo created it. Other, later cave paintings depict other ideas: bulls, horses, rhinoceros, people.

I wasn’t there in the Paleolithic, but I doubt that the images we see in caves were the first ones ever drawn. I imagine people drew images in sand and other less permanent media first. I suspect the only reason we have cave paintings is that at some point somebody decided they wanted to save their idea, to keep it longer or perhaps forever.

Every day, 7 billion of us have untold numbers of ideas. So what makes a person decide that an idea is worth saving? What makes us pause and make a note in our Evernote app or Moleskine journal? What inspires us to make a video of our idea on YouTube, or to write a book? We can’t always be sure that an idea is a “good” one, or even what the criteria for a good idea are. It usually comes down to belief.

For most of the past several centuries, the ability to save ideas was reserved for the few who were deemed skillful or divinely inspired. Books were written in monasteries, then disseminated by printing presses, and as ideas became easier to save, more people saved them. But, and this is very important, saving an idea doesn’t make it a good idea, just a saved one. Somewhere along the line we got the notion that only a select few people were capable of having a good idea, because only a select few were capable of saving them. Even in the 21st century, many mental health professionals and educators cling to the notion that peer-reviewed work published in journals is the apex of quality: if it is written, if it was saved by a select few, it must be a good idea. If you have any doubt about what I’m talking about, just Google “DSM V.”

With each leap in human technology comes the power to save more ideas, and then to spread them. People who talk about things going viral often forget that an idea has to be saved first; in essence, something going viral is society saving an idea. If anything, technology has advanced the democratization of education and ideas.

This makes many of us who grew up in an earlier era nervous and frustrated. We call the younger generation self-absorbed rather than democratizing. We grumble, “What makes you think you should blog about your day, take photos of your food, post links to cute kitten videos?” We may even take smug self-satisfaction in not contributing to the static. I think that’s a bad idea, although it is clearly one that has been saved from earlier times.

40,000 years from now, our ideas may take on meanings we never anticipated, just as the cave drawings have: Why were kittens so important to them? In the long view, I think we should remember that people have to believe they have a good idea before they take the leap of faith to save it. The citizens of the future may debate who saved kitten videos and why, but it will be taken as a given that they must have been important to many of us.

What if everyone had the confidence to believe that they had an idea worth saving? What if everyone had the willingness to believe that it just might be possible their idea was brilliant? Each semester I ask the students in my class to raise their hands if they think they can get an A- or higher in the class, and most do. Then I ask them to raise their hands if they think they can come up with an idea in this class that could change the world. I’ve never had more than 3 hands go up. That’s sad.

This is why I admire the millennials and older groups who take advantage of social media and put their ideas out there.  I doubt that they are all good ideas, but I celebrate the implicit faith it takes to save them.  Anyone, absolutely anyone at all, can have a good idea.  It may not get recognized or appreciated, but now more than ever it can get saved.  Saving an idea is an act of agency.  It is a political act.  Saving an idea is choosing to become just a bit more visible.  On the most basic level saving an idea is a celebration and affirmation of the self.  Think about that, and dare to jot down, draw, record or otherwise save one of your ideas today.  I just did and it feels great.  Then maybe you can even share it with someone else.

What makes a person decide an idea is worth saving?

You do.

 


Bad Object Rising: How We Learn to Hate Our Educated Selves

Recently I had the opportunity to work with a great set of educators in a daylong seminar. One of the things I do with teachers when I present is have them play Minecraft. In this case I started off by giving a general presentation that ended with a story of autodidacticism in an Ethiopian village, where 20 children who had never seen the printed word were given tablets and taught themselves to read. I did this in part to frame the pedagogy for what came next: I had them turn on Minecraft and spend 30 minutes exploring the game with no instruction other than getting them networked.

The responses were as varied as the instructors, but one fascinated me in particular. Midway into the 30 minutes, one teacher stopped playing the game and started checking her email. Later, when we returned to our group to discuss the thoughts and feelings that came up around gameplay, this same teacher spoke up. We were discussing the idea of playfulness in learning when she said, “You know, I hear a lot about games and learning, and making learning fun; but sometimes learning isn’t fun and you have to do it anyway. Sometimes you just have to suck it up and do the work.”

“I’m not saying that I disagree with you entirely,”  I said.  “But then how do we account for your giving up on Minecraft and starting to check your email?”

She looked a little surprised, and after a moment’s reflection said, “Fair enough.”

I use this example because these are educators who are extremely dedicated to teaching their students, and very academically educated themselves.  Academia has this way, though, of seeping into your mind and convincing you that academics and education are one and the same.  They’re not.

I worked in the field of Special Education, from the inside of it, for more than a decade, and one of the things I came to believe is that there are no unteachable students. That is the good news and the bad news. Bad news, because if students were truly unteachable, they couldn’t learn from us that they are dumb or bad when they don’t demonstrate the academic achievement we expect. I remember the youth I worked with calling each other “SPED monkeys” as an insult; clearly they learned that from somewhere, and someone. In object relations terms, they had learned to hate themselves as a bad object, or to project that badness onto other students. They learned this from the adults around them, from the microaggressions of hatred they experienced every day. By hate, I’ll go with Merriam-Webster as close enough: “intense hostility and aversion usually deriving from fear, anger, or sense of injury.”

We tend to mistakenly equate hatred with rage and physical violence, but I suggest this is because we want to set hatred up as something hated by, and other than, ourselves; surely we never behave that way. But hatred is not always garbed in extremity. Hatred appears every day to students who don’t fit the academic mold. Hatred yells “Speak English!” at the 6-year-olds getting off the bus chatting in Spanish. Hatred shakes its head barely (but nevertheless) perceptibly before moving on to the next student when the first has fallen silent in their struggle. Hatred identifies the problem student in the class and bears down on her, saying proudly, “I don’t coddle my students.” And Hatred shrugs its shoulders when a student has been absent for 3 weeks, and waits for them to be dropped from the rolls.

I’m not sure how I came to see this, because I was one of the academically privileged. I got straight A’s, and achieved academic awards and scholarships that lifted me into an upper-class world and peer group. I wrote papers seemingly without effort, read for pleasure, and was excited to have 3 more years of graduate school ahead of me. I have had the opportunity to become an educator and an academic myself, having taught college and graduate students. I could have stayed quiet and siloed in my area of expertise, but work with differently-abled learners taught me something different. It taught me that people learn to dislike education shortly after academia learns to dislike them.

Perhaps one of the best portrayals of adult hatred of divergent thinkers comes from the movie Matilda:

“Listen, you little wiseacre. I’m smart, you’re dumb; I’m big, you’re little; I’m right, you’re wrong. And there’s nothing you can do about it.”

Nowadays I teach in a much different way than I did early on, before I flipped my classrooms and began facilitating guided learning experiences rather than encouraging people to memorize me and the ideas I had memorized from others. And I struggle with this new approach, because I enjoy it so much that I feel guilty. You see, I have internalized the bad object too. Even with my good grades I internalized it. Any time I start to depart from the traditional mold of the educated self, I experience a moment of blindness, then a stony silence that seems to say, “You’re being lazy; you should make them a PowerPoint and prepare a lecture.” Yet if my evaluations on the whole, and the testimonies of students and colleagues, hold any truth, I am a “good” educator. So let’s say I am a “good” educator: if I, as a good educator, struggle with this, we shouldn’t assume that people who struggle with these issues are “bad” educators.

In fact, when it comes to emerging technologies like social media and video games, educators often try to avoid them, if not because they are fun and therefore suspect, then because educators risk experiencing themselves as the bad object: Who wants to experience themselves as hopelessly dumb, clumsy or lazy when they can experience themselves as the bountiful and perfectly cited fount of all wisdom? The truth is, both are distorted images of the educated self.

Don’t forget that educators themselves experience tons of societal hatred. For them it often comes in the guise of curriculum requirements, or of having their performance linked to outcomes on standardized testing. Hatred comes in low salaries, and in the perception that people doing intellectual or emotional labor aren’t really working. All of this helps educators internalize a bad object that feels shaming and awful; is it any wonder that we sometimes unconsciously try to get that bad object away from ourselves and locate it in the student?

The good news, as I said before, is that we are all teachable. We can learn to make conscious, and to make sense of, our internalized bad-object representations. We can see that thinking of people as smart or dumb is a form of splitting.

And yes, there’s a lot we can do about it.
