Bringing Emerging Technology into the Clinical Process: Implications for Engagement and Treatment

If you have ever wondered how to begin attending to, listening for, and asking questions about a patient’s use of technology, this video might give you some ideas.  In it, my colleague Lesa Fichte, LMSW, University at Buffalo School of Social Work, and I discuss the role of technology, people’s relationships with technology, and how to integrate it into the treatment process by listening, inquiring, and learning.

 

Find this video interesting? I can speak in person too:  Check out the Press Kit for Public Speaking info. And, for only $4.99 you can buy my book. You can also Subscribe to the Epic Newsletter!

The Relationship Between Emerging Technology & Psychodynamic Theory

Often when I present, people are surprised that I teach on both emerging technologies such as social media and video games, and classic psychodynamic theories.  Although it may initially seem counterintuitive, especially to classically trained psychotherapists and social workers, I see a strong connection between the two.  Here is the first in a series of posts featuring work I am doing with the University at Buffalo, in which Charles Syms and I discuss the relationship between the two.

 


Social Justice & Technology Revisited


I have written before about how technology often makes life easier for much of the population while simultaneously disenfranchising others.  The good news is that this does not have to be the case.

The example I used in the past was the Starbucks App, which allows customers to pay, earn rewards, and reload their accounts from their smartphones, while making it more cumbersome and difficult to tip baristas.  Again, this does not have to be an inevitability, but it requires Starbucks to enhance the functionality of its App.

So I was pleased to discover this week (special thanks to my student Marissa for bringing this to my attention) that come Wednesday, March 19th, Starbucks will be rolling out an update to its smartphone App which allows just that.  You can read more about it at Forbes here.  You will be able to download the update from places like iTunes and include your tip easily.

While some may dismiss this as a first-world problem, I cannot overemphasize how powerful a shift I consider this to be in terms of workers’ rights in the service sector.  I am convinced it comes in part as a result of advocacy by and for workers, and it sets the bar higher, yet still attainable, for corporations to maximize their value to customers without disenfranchising their employees.

How can you help advocate for social justice in the technology you use?  First, simply by using it mindfully.  Take a few minutes today to open your smartphone and make note of the Apps you use most frequently.  Next, ask yourself: who, if anyone, is disadvantaged by my using this App?  Just thinking about the connections can be a powerful mental exercise.  Notice how complicated it can get fairly quickly:  If I use Evernote frequently, I am less likely to write things down on paper, which may be good for the environment but may also disenfranchise industrial workers in paper mills.  Hold on, did I say that you had to stop using Evernote or lobby for paper mills?  No, I’m asking us to sit with the complexity of the problem for a minute to see the larger systems at play.  Technology has always resulted in job loss for some even as it provides workplace improvements or quality of life for others.  It’s when we don’t think about these things in a more complex way that we stop innovating social justice itself.

Part of what I’m trying to encourage us to see is that social justice, workers’ rights, unions, and any person or group committed to social justice need to keep pace with innovation and, in fact, keep innovating themselves.  Technology always runs the risk of disenfranchising people, especially workers.  If the McCormick reaper does in a few hours the day’s work of three workers, what happens to those three workers?  We still live in a capitalist society in the US, and it is unlikely, as technology improves and reduces the need for human workers, that all of these people will be able to afford to turn their minds and lives to the pursuit of art and culture.  Everything isn’t always getting better for everyone in the current system, and we are seeing overcrowding in occupations ranging from factory work to legal work.

If social justice advocates and social workers are to continue to help the disenfranchised, they are going to need to keep pace with technological developments and continue to think innovatively about 21st-century equity in complex and sustained ways.  And by the way, thinking “the gap is just going to get wider, the social fabric is unraveling” is not an example of innovative thinking, but defeatism that exempts us from the work of innovation.

This brings me back to my social work colleagues, and my continued urging that they keep pace with emerging technologies, especially if they are touting the concept of social innovation.  Social innovation without leveraging emerging technology will ultimately lead to future disenfranchisement.  If you have a social innovation department in your social work program that doesn’t leverage technology, you are not being socially innovative.  I certainly don’t have all the answers, but I know that the answer to social injustice will inevitably need to integrate emerging technology.


Reality Testing & The 7 Billion Rule

In this video, I discuss the ego function of reality testing, how it affects us, and ways to cope with distortions in it.  This is also another example of how I use technology, in particular YouTube, as a transitional object for patients, allowing them to continue to remember our work together without compromising any of their personal health information.

This will be the last post for 2013.  Have a good end of the year, and I’ll see you sometime in late January!

 


Obama, Selfies, Projections & Death

In this video, Mike Langlois, LICSW, gives an analysis of what the furor around President Obama’s selfie at Mandela’s funeral could say, not about him, but about us.

 

 


Selfie Esteem

Nancy J. Smyth, PhD, Dean & Professor, University at Buffalo

 

“Photographs do not explain, they acknowledge.” –Susan Sontag

Last month, Oxford Dictionaries made “selfie” not only an official word, but their word of the year for 2013.  Defining a selfie as “a photograph that one has taken of oneself, typically one taken with a smartphone or webcam and uploaded to a social media website,” the OD made explicit what has implicitly become the norm of our world: a world of smartphones, self-pics, and social media.

Many psychotherapists and social workers have decried, and will continue to decry, this as another sign of the “narcissism” of our age.  Selfies have become synonymous with the millennials, the dumbing down of the populace by the internet, and sometimes even claims about how Google is making us stupid.  My chosen profession has historically played fast and loose with calling people and cultures narcissistic.  Karen Horney coined the term “the neurotic personality of our time” in the 1930s, initially in part as a critique of the Freudian critique of Victorian modesty.  Kohut’s groundbreaking work on “tragic man” and the healthy strands of narcissism in human life was co-opted within years by Lasch (1979) to describe the then-current “culture of narcissism.”  In short, even though narcissism has been a part of being human at least since Narcissus gazed into the water in Greco-Roman times, we continue to see it as perennially on the rise.

 

Joanna Pappas, Epic MSW Student

 

This dovetails with each generation’s lament that the subsequent one has become more self-absorbed.  And yet, as Sontag points out, by making photography everyday, “everybody is a celebrity.”  Yep, that’s what we hate about the millennials, right?  They think everything is an accomplishment, their every act destined for greatness.  But as Sontag goes on to say, making everybody a celebrity is also making another interesting affirmation: “no person is more interesting than any other person.”

 

Jonathan Singer, Assistant Professor, Temple University

 

Why do many of us (therapists in particular) have a problem then with selfies?  Why do we see them as a “symptom” of the narcissism of the age?  Our job is to find the interesting in anyone, after all. We understand boredom as a countertransference response in many cases, our attempt to defend against some projection of the patient’s.  So why the hating on selfies?

I think Lewis Aron hits on the answer, or at least part of it, in his paper “The Patient’s Experience of the Analyst’s Subjectivity.”  In it he states the following:

 

I believe that people who are drawn to analysis as a profession have particularly strong conflicts regarding their desire to be known by another; that is, they have conflicts concerning intimacy.  In more traditional terms, these are narcissistic conflicts over voyeurism and exhibitionism.  Why else would anyone choose a profession in which one spends one’s life listening and looking into the lives of others while one remains relatively silent and hidden?

(Aron, A Meeting of Minds, 1996, p. 88)

 

In other words, I believe that many of my colleagues have such disdain for selfies because they secretly yearn to take and post them.  If you shuddered with revulsion just now, check yourself.  I certainly resemble that remark at times:  I struggled long with whether to post my own selfie here.  What might my analytically-minded colleagues think?  My patients, students, supervisees?  I concluded that the answers will vary, but in general the truth that I’m a human being is already out there.

 

Mike Langlois, PvZ Aficionado

 

Therapists like to give themselves airs, including, in many instances, an air of privacy.  We get hung up on issues of self-disclosure, when what the patient is often really looking for is a revelation that we have a subjectivity, rather than disclosure of personal facts.  And as Aron points out, our patients often pick up on our feelings of resistance or discomfort, and toe the line.  One big problem with this is that we don’t know what they aren’t telling us, because they didn’t tell us.  In the 60s and 70s very few LGBT issues were voiced in therapy, and the naive conclusion was that this was because LGBT people and experiences were a minority, in society in general and in one’s practice in specific.  Of course, nobody was asking patients if they were LGBT, and by not asking, therapists communicated their discomfort.

What has this got to do with selfies?  Well, for one thing, I think that therapists are often similarly dismissive of technology, and convey this by not asking about it in general.  Over and over I hear the same thing when I present on video games: “None of my patients talk about them.”  When I suggest that they begin asking about them, many therapists have come back to me describing something akin to a dam bursting in the conversation of therapy.  But since we can’t prove a null hypothesis, let me offer another approach to selfies.

All photographs, selfie or otherwise, do not explain anything.  For example:

 

[Photo: looting]

 

People who take a selfie are not explaining themselves; they are acknowledging that they are worth being visible.  Unless you have never experienced any form of oppression, this should be self-evident, but in case you grew up absolutely mirrored by a world that thought you were the right size, shape, color, gender, orientation, and class, I’ll explain:  Many of our patients have at least a sneaking suspicion that they are not people.  They look around the world and see others with the power and prestige, and they compare that to the sense of emptiness and invisibility they feel.  Other people can go to parties, get married, work in the sciences, have children, buy houses, etc.  But they don’t see people like themselves prevailing in these areas.  As far as they knew, they were the only biracial kid in elementary school, the only adoptee in middle school, the only bisexual in high school, the only trans person in college, the only rape survivor at their workplace.

So if they feel that they’re worth a selfie, I join with them in celebrating themselves.

As their therapist I’d even have some questions:

  • What were you thinking and feeling that day you took this?
  • What do you hope this says about you?
  • What do you hope this hides about you?
  • Who have you shared this with?
  • What was their response?
  • What might this selfie tell us about who you are?
  • What might this selfie tell us about who you wish to be?
  • Where does that spark of belief that you are worth seeing reside?

In addition to exploring, patients may find it a useful intervention to keep links to certain selfies which evoke particular self-concepts and affect states.  That way, if they need a shift in perspective or affect regulation, they can immediately access a powerful visual reminder which says, “This is possible for you.”

Human beings choose to represent themselves in a variety of ways, consciously and unconsciously.  Selfies can be whimsical, professional, casual, friendly, provocative, erotic, aggressive, acerbic, delightful.  Are they projections of our idealized self?  Absolutely.  Are they revelatory of our actual self?  Probably.  They explain nothing, acknowledge the person who takes them, and celebrate a great deal.  If there is a way you can communicate a willingness to see your patients’ selfies, you might be surprised at what opens up in the therapy for you both.

 

Melanie Sage, Assistant Professor, University of North Dakota

 

In other posts I have written about Huizinga’s concept of play.  Rather than seeing selfies as the latest sign that we are going to hell in a narcissistic handbasket, what if we looked at the selfie as a form of play?  Selfies invite us into the play element in the other’s life; they are not “real” life but free and unbounded.  They allow each of us to transcend the ordinary for a moment in time, to celebrate the self, and to share with a larger community as a form of infinite game.

It may be beyond any of us to live up to the ideal that no one is less interesting than anyone else in our everyday lives, but seen in this light the selfie is a renunciation of the cynicism I sometimes see in the mental health professionals I meet.  We sometimes seem to privilege despair as somehow more meaningful and true than joy and celebration, but aren’t both essential parts of the human condition?  So if you are a psychotherapist or psychoeducator, heed my words:  The Depth Police aren’t going to come and take your license away, so go out and snap a selfie while everyone is looking.


The Changing Landscape of Social Work


Recently I had the great opportunity to be a scholar-in-residence at The University at Buffalo’s School of Social Work.  For three days I met with students, faculty and staff to speak about emerging technologies ranging from Twitter to video games.  During one morning, Dean Nancy Smyth and I sat down for a series of informal discussions around various topics, and the University was kind enough to let me share these videos with you.  If you want to learn more about how I can come to your institution to do the same thing, please contact me.

How to Use Social Media and Technology to Develop a Personal Learning Network:

 

http://www.youtube.com/watch?v=zb74jYN0k5Y&feature=share&list=UUQG8usDJjq8OjMgtNDQC6fg

 

If I Don’t Use Social Media and Technology in Social Work Practice What Am I Missing?

 

 

Social Work is Changing:  Integrating Social Media and Technology Into Social Work Practice

 

http://youtu.be/FQWUMTxXVus

 

 


Mental Health: Yes, There’s an App for That…


Nobody wants to be irrelevant, and many mental health practitioners want to try out new technologies like Apps, but how to choose?  Currently the App Store in iTunes offers 835,440 different Apps, of which approximately 100,000 are categorized as lifestyle, medical, or healthcare & fitness.  And Android users have just about as many to choose from: according to AppBrain, there are a whopping 858,870 as of today.  With so many to look at, how can a clinician keep current?  Hopefully we can help each other.

Instead of writing the occasional “Top 10” post, I’m setting up a site for you to visit and review different Apps.  I’ll review some too, and hopefully by crowdsourcing we can get a sense of which are among the best.  I’ll need Android users to weigh in heavily, as I will be test-driving Apple products alone.

Why have I decided to do this?  Several reasons, the complicated one first:

1. Web 2.0 is interactive.  We forget that, even those of us who are trying to stay innovative.  We keep thinking we need to get on the podium and deliver lectures, information, content.  And to a degree that is true, but we can easily slide back into the old model of doing things.  That’s what you see in a lot of our well-intentioned “Top 10 Apps” posts and articles.  Recently I found myself trying to explain, on several occasions, why doing a lecture or post on the best Apps for mental health didn’t sit right with me.  Part of it was that Apps are put out there so fast, and then surpassed by other Apps, that it becomes a bit like Project Runway:  “One day you’re in, the next day you’re out.”

I was getting trapped behind that podium again, until I realized that we don’t need another post about the top 10 mental health Apps; we need an interactive platform.  I need to stop acting as if I’m the only one responsible for delivering content, and you need to break out of the mold of passive recipient of information.  I’m sure that many of my colleagues have suggestions for Apps that are great for their practice, and I’m hoping that you all share.  Go to the new site, check out some of the ones I mentioned, and then add your own reviews.  Email me some Apps and I’ll try ’em and add them to the site.  Let’s create something much better than a top 10 post with an expiration date; let’s collaborate on a review site together.  Which brings me to:

2.  I want to change the world.  That is the reason I became a social worker, a therapist, and a public speaker.  I think ideas motivate actions, and actions can change the world. The more access people have to products that can improve their mental health, the better.  By creating a site dedicated solely to reviewing mental health applications, we can raise awareness about using emerging technologies for mental health, and help other people improve their lives.  Technology can help us, which brings me to:

3.  Technology can improve our mental health.  Yes, you heard it here.  Not “we need to be concerned about the ethical problems with technology X, Y, or Z,” not “the internet is making us stupid” or “video games are making people violent,” but rather an alternate vision:  Namely, that emerging technologies can allow more people more access to better mental health.  Let’s start sharing examples of the ways technology does that.  There are Apps and other emerging technologies that can help people with Autism, Bipolar Disorder, Eating Disorders, Social Phobias, Anxiety, PTSD, and many other mental health issues.  I can’t possibly catalog all those alone, so I’m hoping you’ll weigh in and let me know which Apps or technologies have helped you with your own struggles.

Is the new site, Mental Health App Reviews, a finished product?  Absolutely not.  What it will be depends largely on all of us.  This is how crowdsourcing can work.  This is how Web 2.0 can work.

If you want to contribute, just email me at mike@mikelanglois.com with the following:

  • App name
  • Screenshot if possible
  • Price
  • Link to App

and I’ll take it from there.  Please also let me know in the email whether you are a mental health provider and/or the product owner.

You can also contribute by reviewing the Apps below that you use.  Be as detailed as possible; we’re counting on you!  And while you’re at it, follow us on Twitter @MHAppReviews.

The Internet Is Not A Meritocracy, That’s Why You Hate It

lightbulb

Recently, I had a discussion with a student about social media and the fact that I usually start off a comment on a blog with “great post!”  She noted two things:  First, that it rang false to her initially, making her wonder if I even read the posts people write; and second, that despite this initial impression she found herself commenting anyway.  So let me define what a great post is.

A great post is one that captures your interest and keeps the thoughtful discourse going.

Now many of my academic readers are going to vehemently disagree.  They may disagree with this blog post entirely, and you know what?  If they comment on it, I’ll publish the comment.  Because the comment keeps the discourse going.

Also recently, I was explaining my pedagogy to colleagues who were questioning my choice to assign a whole-class group assignment worth 25% of the student grade.  The concern was that by giving the class a grade as a whole, I would run the risk of grade inflation.  This is a real concern for many of my peers in academia, and I respect that; as someone who believes in collaboration, I intend to balance advocating for my pedagogical view with integrating the group’s discerning comments and suggestions.  In my blog, however, let me share my unbridled opinion on this.

I don’t care about grade inflation.

Really, I don’t.  I went to a graduate school which didn’t have grades, but had plenty of intellectual rigor.  I am more concerned with everyone having a chance to think and discuss than ranking everyone in order.  That is my bias, and that is one reason I like the internet so much.

The old model of education is a meritocracy, which, according to the OED, is:

Government or the holding of power by people chosen on the basis of merit (as opposed to wealth, social class, etc.); a society governed by such people or in which such people hold power; a ruling, powerful, or influential class of educated or able people.

 

I think that Education 2.0 has many of us rethinking this.  Many of our students were indoctrinated into a view of education that is decidedly meritocratic.  I suspect this was part of what was behind my student’s skepticism about “great post!”  My role as an educator in a meritocracy is to evaluate the merit of comments and ideas, rank them, and award high praise only to those which truly deserve it.  By “great posting” everything, I demean student endeavors.

One of my colleagues, Katie McKinnis-Dietrich, frequently talks about “finding the A in the student.”  This interests me more than the finite game of grading.  Don’t get me wrong:  I do offer students choices about how to earn the highest marks in our work together, and I do require things of them; but I try hard to focus more on the content and discourse than on grades.

I frequently hear from internet curmudgeons that the internet is dumbing down the conversation.  The internet isn’t dumbing down the conversation:  The internet is widening it.  Just as post-Gutenberg society allowed literacy to spread through the general population, Web 2.0 has allowed more and more human beings access to the marketplace of ideas.  We are at a historic point in that marketplace, with more intellectual wares being bought and sold than ever.  More discernment is certainly required, but the democratization of the internet has also revealed the internalized academic privilege we often take for granted.  Every ivory tower now has WiFi, and so we see more incidents of sneering at someone’s grammar and picking apart their spelling.  What is revealed is not just the poor grammar and spelling of the other, but our own meritocratic tendencies.

Detractors will pointedly ask me if I would undergo surgery performed by someone who had never been to medical school, and I will readily admit that I would not.  But how can we reconcile that with the story of Jack Andraka, a 15-year-old who, with no formal training in medicine, created a test for pancreatic cancer that is 100 times more sensitive and 26,000 times cheaper than current tests?  In fact, if you listen to his TED talk, Jack implicitly tells the story of how only one of the many universities he contacted took him seriously enough to help him take his discovery to the next level.  Meritocracy in this case slowed down the process of early intervention with pancreatic cancer.  One side of this story is that this test will save countless lives; the darker side is how many lives were lost because the meritocracy refused to believe that someone who hadn’t been educated in the Scholastic tradition could have a really good idea.

I am urgently concerned with moving education further in the direction of democracy and innovation.  Any post that gets me thinking and interacting thoughtfully with others is a great post.  On a good day I remember this.

But like many academics and therapists and educators and human beings brought up in a meritocracy, I have my bad days.  Like many of you, I fear becoming irrelevant.  I resist change, whether it be the latest iOS or shifting social mores.  Last night I caught myself internally reprimanding the guy wearing a baseball cap to dinner in the restaurant I was in.

We still live in a world where only students with “special needs” have individualized education plans; quite frankly, I think that everyone should have an individualized education plan.  I think our days of A’s being important are numbered.  There are too many “A students” unemployed or underemployed, and too many untenured professors per tenure slot, for our educational meritocracy to keep conferring the same level of privilege.  Digital literacy is the new frontier, and I hope our goal is going to be maximizing the human potential of everyone, for everyone’s sake.  Yes, this is a populist vision:  I think the educational “shining city on the hill” needs to be a TARDIS, with room for the inclusion of all.  I also think that those of us who have benefited from scholastic privilege will not give this privilege up easily.  We desperately want to remain relevant.

I know it is risky business putting this out in the world where my colleagues could see it.  I know this will diminish my academic standing in the eyes of many.  I know my students may read it and co-opt my argument to try to persuade me to give the highest grade.  But if I believe in discourse and collaboration I’ll have to endure that and walk the walk.

I’m not saying that every idea is a good one.  What I am saying, what I believe, and what has changed my life for the better, is something I find more humbling and amazing about the human being:  Not every idea is a good one, but anyone, anyone at all, can have a good idea.


Innovation is Dangerous & Gaming Causes Asperger’s

GamerTherapist blog is on vacation and will return with new posts after Labor Day.  In the meantime, here is a reader favorite:

At its heart, diagnosis is about exerting control.  Clinicians want to get some sense of control in understanding a problem.  We link diagnosis to prognosis to control our expectations of how likely we are to see a change in the patient’s condition, and how much.  Insurance companies want to get a handle on how much to spend on whom.  Schools want to control access to resources and organize their student bodies.  And with the current healthcare situation, the government is sure to use diagnosis as a major part of the criteria for determining who gets what kind of care.

We therapists and educators do not like to think of ourselves as controlling people.  But we often inadvertently attempt to exert control over our patients, and over entire segments of the population, by defining something as a problem and then locating it squarely in the individual we are “helping.”

This week has been one of those weeks where I have heard from several different colleagues about workshops they are attending in which the presenters link Asperger’s with Gaming Addiction:  Not in the sense of “many people on the Autism Spectrum find success and motivation through the use of video games,” but rather in the sense of “excessive gaming is prevalent in the autistic spectrum community.”

This has always frustrated me, for several reasons, and I decided it’s time to elaborate on them again:

1. Correlation does not imply Causation.  Although this is basic Statistics 101 stuff, therapists and educators continue to make this mistake over and over.  Lots of people with Asperger’s play video games, this is true.  This should not surprise us, because lots of people play video games!  97% of all adolescent boys and 94% of adolescent girls, according to the Pew Research Center.  But we love to make connections, and we love the idea that we are “in the know.”  I can’t tell you how many times, when I worked in education and clinics, I heard talk of people “suspected” of having Asperger’s because they liked computers and did not make eye contact.  Really.  If a kiddo didn’t look at the teacher and liked to spend time on the computer, a suggested diagnosis of Autism couldn’t be far behind.  We like to see patterns in life, even oversimplified ones.

2. Causation often DOES imply bias.  Have you ever stopped to wonder what causes “neurotypical” behavior?  Or what causes heterosexuality, for that matter?  Probably not.  We usually look for the causation of things we are busily pathologizing in people.  We want everyone to fit within the realm of what the unspoken majority has determined is normal.  Our education system is still prone to be designed like a little factory.  We want to have our desks in rows, our seats assigned, and our tests standardized.  So if your sensory input is a little different, or your neurology atypical, you get “helped.”  Your behavior is labeled inappropriate if it diverges, and you are taught that you do not have social skills and need to learn them.

Educators, parents, therapists, and partners of folks on the Autism Spectrum, please repeat this mantra three times:

It is not good social skills to tell someone they do not have good social skills.

By the same token, technology and video games are not bad or abnormal either.  Don’t you see that it is this consensus attitude, that there is something “off” about kids with differences or gamers or geeks, that silently telegraphs to school bullies that certain kids are targets?  Yet when an adolescent has no friends and is bullied, it is often considered understandable because they have “poor social skills and spend too much time on the computer.”  Of course, many of the same kids are successfully socializing online through these games, and are active members of guilds where the stuff they hear daily in school is not tolerated in guild chat.

Let’s do a little experiment:  How about I forbid you to go to your book discussion group, poker night, or psychoanalytic institute.  Instead, you need to spend all of your time with the people at work who annoy you, gossip about you and make your life miserable.  Sorry, but it is for your own good.  You need to learn to get along with them, because they are a part of your real life.  You can’t hide in rooms with other weirdos who like talking about things that never happened or happened a long time ago; or hide in rooms with other people that like to spend hours holding little colored pieces of cardboard, sort them, and exchange them with each other for money; or hide in rooms where people interpret dreams and talk about “the family romance.”

I’m sure you get my point.  We have forgotten how little personal power human beings have before they turn 18.  So even if playing video games were a sign of Asperger’s, we would need to reconsider our idea that there is something “wrong” with neuro-atypical behaviors.  There isn’t.

A lot of the work I have done with adults on the spectrum has been to help them debrief the trauma of the first 20 years of their lives.  I’ve had several conversations where we’ve realized that they are afraid to ask me or anyone questions about how to do things, because they worried that asking the question was inappropriate or showed poor social skills.  Is that really what you want our children to learn in school and in treatment?  That it is not ok to ask questions?  What a recipe for a life of loneliness and fear!

If you aren’t convinced, please check out this list of famous people with ASD.  They include actors (Daryl Hannah), bankers, composers, rock stars, a royal prince and the creator of Pokemon.  Not really surprising when you think about innovation.

3. Innovation is Dangerous.  Innovation, like art, requires you to want things to be different than the way they are.  These are the kids who don’t like to do math “that way,” or are seen as weird.  These are the “oversensitive” ones.  These are the ones who spend a lot of time in fantasy, imagining a world that is different.  These are the people I want to have over for hot chocolate and talk to, frankly.

But in our world, innovation is dangerous.  There are unspoken social contracts that support normalcy and bureaucracy (have you been following Congress lately?)  And there are hundreds of our colleagues who are “experts” in trying to get us all marching in lockstep, even if that means killing a different drummer.  When people try to innovate, they are mocked, fired from their jobs, beaten up, put down and ignored.  It takes a great deal of courage to innovate.  The status quo is not neutral; it actively tries to grind down those who are different.

People who are fans of technology, which nowadays means the internet and computing, have always been suspect, treated as different or out of touch with reality.  They spend “too much time on the computer,” we think, until they discover the next cool thing, or crack a code that will help fight HIV.  Only after society sees the value of what they have done do they get any slack.

Stop counting the hours your kid is playing video games and start asking them what they are playing and what they like about it.  Stop focusing exclusively on the “poor social skills” of vulnerable kids and start paying attention to bullies, whether they be playground bullies or experts.  Stop worrying about what causes autism and start worrying about how to make the world a better place for people who have it.

Like this post? I can speak in person too, check out the Press Kit for Public Speaking info. And, for only $4.99 you can buy my book. You can also Subscribe to the Epic Newsletter!