Mental Health: Yes, There’s an App for That.


Nobody wants to be irrelevant, and many mental health practitioners want to try out new technologies like Apps, but how to choose?  Currently the App Store in iTunes makes available 835,440 different Apps, of which approximately 100,000 are categorized as lifestyle, medical, or health & fitness.  And Android users have just about as many to choose from according to AppBrain, which says there are a whopping 858,870 as of today.  With so many to look at, how can a clinician keep current?  Hopefully we can help each other.

Instead of writing the occasional “Top 10 Post,” I’m setting up a site for you to visit and review different Apps.  I’ll review some too, and hopefully by crowdsourcing we can get a sense of which are some of the best.  I’ll need Android users to weigh in heavily, as I will be test-driving only Apple products.

Why have I decided to do this?  Several reasons, the complicated one first:

1. Web 2.0 is interactive.  We forget that, even those of us who are trying to stay innovative.  We keep thinking we need to get on the podium and deliver lectures, information, content.  And to a degree that is true, but we can easily slide back to the old model of doing things.  That’s what you see in a lot of our well-intentioned “Top 10 App” posts and articles.  Recently I found myself trying to explain on several occasions why doing a lecture or post on the best Apps for Mental Health didn’t sit right with me.  Part of it was because Apps are put out there so fast, and then surpassed by other apps, that it becomes a bit like Project Runway:  “One day you’re in, the next day you’re out.”

I was getting trapped behind that podium again, until I realized that we don’t need another post about the top 10 mental health apps, we need an interactive platform.  I need to stop acting as if I’m the only one responsible for delivering content, and you need to break out of the mold of passive recipient of information.  I’m sure that many of my colleagues have some suggestions for apps that are great for their practice, and I’m hoping that you all share.  Go to the new site, check out some of the ones I mentioned, and then add your own reviews.  Email me some apps and I’ll try ’em and add them to the site.  Let’s create something much better than a top 10 post with an expiration date, let’s collaborate on a review site together.  Which brings me to:

2.  I want to change the world.  That is the reason I became a social worker, a therapist, and a public speaker.  I think ideas motivate actions, and actions can change the world. The more access people have to products that can improve their mental health, the better.  By creating a site dedicated solely to reviewing mental health applications, we can raise awareness about using emerging technologies for mental health, and help other people improve their lives.  Technology can help us, which brings me to:

3.  Technology can improve our mental health.  Yes, you heard it here.  Not, “we need to be concerned about the ethical problems with technology X, Y, or Z.”  Not, “the internet is making us stupid,” or “video games are making people violent,” but rather an alternate vision:  Namely, that emerging technologies can allow more people more access to better mental health.  Let’s start sharing examples of the way technology does that.  There are Apps and other emerging technologies that can help people with Autism, Bipolar, Eating Disorders, Social Phobias, Anxiety, PTSD and many more mental health issues.  I can’t possibly catalog all those alone, so I’m hoping you’ll weigh in and let me know which Apps or tech have helped you with your own struggles.

Is the new site, Mental Health App Reviews, a finished product?  Absolutely not.  What it will be depends largely on all of us.  This is how crowdsourcing can work.  This is how Web 2.0 can work.

If you want to contribute, just email me at mike@mikelanglois.com with the following:

  • App name
  • Screenshot if possible
  • Price
  • Link to App

and I’ll take it from there.  Please also let me know in the email whether you are a mental health provider and/or the product owner.

You can also contribute by reviewing the Apps below that you use.  Be as detailed as possible, we’re counting on you!  And while you’re at it, follow us on Twitter @MHAppReviews

The Internet Is Not A Meritocracy, That’s Why You Hate It


Recently, I had a discussion with a student about social media, and the fact that I usually start off a comment on a blog with “great post!”  She noted two things:  First, that it rang false to her initially, making her wonder if I even read the posts people write; and second, that despite this initial impression she found herself commenting anyway.  So let me define what a great post is.

A great post is one that captures your interest and keeps the thoughtful discourse going.

Now many of my academic readers are going to vehemently disagree.  They may disagree with this blog post entirely, and you know what?  If they comment on it, I’ll publish the comment.  Because the comment keeps the discourse going.

Also recently, I was explaining my pedagogy to colleagues who were questioning my choice to assign a whole-class group assignment for 25% of the student grade.  The concern was that by giving the class a grade as a whole I would run the risk of grade inflation.  This is a real concern for many of my peers in academia, and I respect that; as someone who believes in collaboration, I intend to balance advocating for my pedagogical view with integrating the group’s discerning comments and suggestions.  In my blog, however, let me share my unbridled opinion on this.

I don’t care about grade inflation.

Really, I don’t.  I went to a graduate school which didn’t have grades, but had plenty of intellectual rigor.  I am more concerned with everyone having a chance to think and discuss than ranking everyone in order.  That is my bias, and that is one reason I like the internet so much.

The old model of education is a meritocracy, which according to the OED is:

Government or the holding of power by people chosen on the basis of merit (as opposed to wealth, social class, etc.); a society governed by such people or in which such people hold power; a ruling, powerful, or influential class of educated or able people.

 

I think that Education 2.0 has many of us rethinking this.  Many of our students were indoctrinated into a view of education that is decidedly meritocratic.  I suspect this was part of what was behind my student’s skepticism about “great post!”  My role as an educator in a meritocracy is to evaluate the merit of these comments and ideas, rank them, and award high praise only to those which truly deserve it.  By that logic, calling everything a “great post” demeans student endeavors.

One of my colleagues, Katie McKinnis-Dietrich, frequently talks about “finding the A in the student.”  This interests me more than the finite game of grading.  Don’t get me wrong, I do offer students choices about how to earn the highest marks in our work together, and I do require things of them; but I try hard to focus more on the content and discourse than on grades.

I frequently hear from internet curmudgeons that the internet is dumbing down the conversation.  The internet isn’t dumbing down the conversation:  The internet is widening it.  Just as post-Gutenberg society allowed literacy to become part of the general population, Web 2.0 has allowed more and more human beings to have access to the marketplace of ideas.  We are at an historic point in the marketplace of ideas, where more intellectual wares are being bought and sold.  More discernment is certainly required, but the democratization of the internet has also revealed the internalized academic privilege we often take for granted.  Every ivory tower now has WiFi, and so we can experience more incidents of our sneering at someone’s grammar and picking apart their spelling.  What is revealed is not just the poor grammar and spelling of the other, but our own meritocratic tendencies.

Detractors will pointedly ask me if I would undergo surgery performed by someone who had never been to medical school, and I will readily admit that I will not.  But how can we reconcile that with the story of Jack Andraka, a 15-year-old who, with no formal training in medicine, created a test for pancreatic cancer that is 100 times more sensitive and 26,000 times cheaper than current tests?  In fact, if you listen to his TED talk, Jack implicitly tells the story of how only one of the many universities he contacted took him seriously enough to help him take this discovery to the next level.  Meritocracy in this case slowed down the process of early intervention with pancreatic cancer.  One side of this story is that this test will save countless lives; the darker side is how many lives were lost because the meritocracy refused to believe that someone who hadn’t been educated in the Scholastic tradition could have a really good idea.

I am urgently concerned with moving education further in the direction of democracy and innovation.  Any post that gets me thinking and interacting thoughtfully with others is a great post.  On a good day I remember this.

But like many academics and therapists and educators and human beings brought up in a meritocracy, I have my bad days.  Like many of you, I fear becoming irrelevant.  I resist change, whether it be the latest iOS or social mores.  Last night I caught myself reprimanding (internally) the guy wearing a baseball cap to dinner in the restaurant I was in.

We still live in a world where only students with “special needs” have individualized education plans– quite frankly, I think that everyone should have an individualized education plan.  I think our days of A’s being important are numbered.  There are too many “A students” unemployed or underemployed, and too many untenured professors competing for each slot, for our educational meritocracy to keep conferring the same level of privilege.  Digital literacy is the new frontier, and I hope our goal is going to be maximizing the human potential of everyone for everyone’s sake.  Yes, this is a populist vision: I think the educational “shining city on the hill” needs to be a TARDIS, with room for the inclusion of all.  I also think that those of us who have benefited from scholastic privilege will not give this privilege up easily.  We desperately want to remain relevant.

I know it is risky business putting this out in the world where my colleagues could see it.  I know this will diminish my academic standing in the eyes of many.  I know my students may read it and co-opt my argument to try to persuade me to give the highest grade.  But if I believe in discourse and collaboration I’ll have to endure that and walk the walk.

I’m not saying that every idea is a good one.  What I am saying, what I believe that has changed my life for the better is something I find more humbling and amazing about the human being:  Not every idea is a good one, but anyone, anyone at all, can have a good idea.

Like this post? I can speak in person too, check out the Press Kit for Public Speaking info. And, for only $4.99 you can buy my book. You can also Subscribe to the Epic Newsletter!

Innovation is Dangerous & Gaming Causes Asperger’s

The GamerTherapist blog is on vacation and will return with new posts after Labor Day.  In the meantime, here is a reader favorite:

At its heart, diagnosis is about exerting control.  Clinicians want to get some sense of control in understanding a problem.  We link diagnosis to prognosis to control our expectations of how likely and how much we will see a change in the patient’s condition.  Insurance companies want to get a handle on how much to spend on who.  Schools want to control access to resources and organize their student body.  And with the current healthcare situation, the government is sure to use diagnosis as a major part of the criteria in determining who gets what kind of care.

Therapists and Educators do not like to think of ourselves as controlling people.  But we often inadvertently attempt to exert control over our patients and entire segments of the population, by defining something as a problem and then locating it squarely in the individual we are “helping.”

This week has been one of those weeks where I have heard from several different colleagues about workshops they are attending where the presenters are linking Asperger’s with Gaming Addiction:  Not in the sense of “Many people on the Autism Spectrum find success and motivation through the use of video games,” but rather in the sense of “excessive gaming is prevalent in the autistic spectrum community.”

This has always frustrated me, for several reasons, and I decided it’s time to elaborate on them again:

1. Correlation does not imply Causation.  Although this is basic statistics 101 stuff, therapists and educators continue to make this mistake over and over.  Lots of people with Asperger’s play video games, this is true.  This should not surprise us, because lots of people play video games!  97% of all adolescent boys and 94% of adolescent girls, according to the Pew Research Center.  But we love to make connections, and we love the idea that we are “in the know.”  I can’t tell you how many times when I worked in education and clinics I heard talk of people who were “suspected” of having Asperger’s because they liked computers and did not make eye contact.  Really.  If a kiddo didn’t look at the teacher, and liked to spend time on the computer, a suggested diagnosis of Autism couldn’t be far behind.  We like to see patterns in life, even oversimplified ones.

2. Causation often DOES imply bias.  Have you ever stopped to wonder what causes “neurotypical” behavior?  Or what causes heterosexuality, for that matter?  Probably not.  We usually try to look for the causation of things we are busily pathologizing in people.  We want everyone to fit within the realm of what the unspoken majority has determined as normal.  Our education system is still largely designed like a little factory.  We want to have our desks in rows, our seats assigned, and our tests standardized.  So if your sensory input is a little different, or your neurology atypical, you get “helped.”  Your behavior is labeled as inappropriate if it diverges, and you are taught that you do not have social skills and need to learn them.

Educators, parents, therapists and partners of folks on the Autism Spectrum, please repeat this mantra 3 times:

It is not good social skills to tell someone they do not have good social skills.

By the same token, technology, and video games, are not bad or abnormal either.  Don’t you see that it is this consensus attitude that there is something “off” about kids with differences or gamers or geeks that silently telegraphs to school bullies that certain kids are targets?  Yet, when an adolescent has no friends and is bullied it is often considered understandable because they have “poor social skills and spend too much time on the computer.”  Of course, many of the same kids are successfully socializing online through these games, and are active members of guilds where the stuff they hear daily in school is not tolerated on guild chat.

Let’s do a little experiment:  How about I forbid you to go to your book discussion group, poker night, or psychoanalytic institute.  Instead, you need to spend all of your time with the people at work who annoy you, gossip about you and make your life miserable.  Sorry, but it is for your own good.  You need to learn to get along with them, because they are a part of your real life.  You can’t hide in rooms with other weirdos who like talking about things that never happened or happened a long time ago; or hide in rooms with other people that like to spend hours holding little colored pieces of cardboard, sort them, and exchange them with each other for money; or hide in rooms where people interpret dreams and talk about “the family romance.”

I’m sure you get my point.  We have forgotten how little personal power human beings have before they turn 18.  So even if playing video games was a sign of Asperger’s, we need to reconsider our idea that there is something “wrong” with neuro-atypical behaviors.  There isn’t.

A lot of the work I have done with adults on the spectrum has been to help them debrief the trauma of the first 20 years of their lives.  I’ve had several conversations where we’ve realized that they are afraid to ask me or anyone questions about how to do things, because they worried that asking the question was inappropriate or showed poor social skills.  Is that really what you want our children to learn in school and in treatment?  That it is not ok to ask questions?  What a recipe for a life of loneliness and fear!

If you aren’t convinced, please check out this list of famous people with ASD.  They include actors (Daryl Hannah,) bankers, composers, rock stars, a royal prince and the creator of Pokemon.  Not really surprising when you think about innovation.

3.  Innovation is Dangerous.  Innovation, like art, requires you to want things to be different than the way they are.  Those are the kids that don’t like to do math “that way,” or are seen as weird.  These are the “oversensitive” ones.  These are the ones who spend a lot of time in fantasy, imagining a world that is different.  These are the people I want to have over for hot chocolate and talk to, frankly.

But in our world, innovation is dangerous.  There are unspoken social contracts that support normalcy and bureaucracy (have you been following Congress lately?)  And there are hundreds of our colleagues who are “experts” in trying to get us all marching in lockstep, even if that means killing a different drummer.  When people try to innovate, they are mocked, fired from their jobs, beaten up, put down and ignored.  It takes a great deal of courage to innovate.  The status quo is not neutral, it actively tries to grind those who are different down.

People who are fans of technology, which nowadays means the internet and computing, have always been suspect, and treated as different or out of touch with reality.  They spend “too much time on the computer,” we think, until they discover the next cool thing, or crack a code that will help fight HIV.  Only after society sees the value of what they have done do they get any slack.

Stop counting the hours your kid is playing video games and start asking them what they are playing and what they like about it.  Stop focusing exclusively on the “poor social skills” of the vulnerable kids and start paying attention to bullies, whether they be playground bullies or experts.  Stop worrying about what causes autism and start worrying about how to make the world a better place for people with it.

Like this post? I can speak in person too, check out the Press Kit for Public Speaking info. And, for only $4.99 you can buy my book. You can also Subscribe to the Epic Newsletter!

Dopey About Dopamine: Video Games, Drugs, & Addiction

Last week I was speaking to a colleague whose partner is a gamer. She was telling me about their visit to his mother. During the visit my colleague was speaking to his mother about how much he still enjoys playing video games. His mother expressed how concerned she had been about his playing when he was young. “It could have been worse though,” she’d said, “at least he wasn’t into drugs.”

This comparison is reminiscent of the homophobic one where the tolerant person says, “I don’t mind if you’re gay, as long as you don’t come home with a goat.” The “distinction” made actually implies that the two things are comparable. But in fact they are not.

Our culture uses the word addiction pretty frequently and casually. And gamers and opponents of gaming alike use it in reference to playing video games. Frequently we hear the comments “gaming is like a drug,” or “video games are addictive,” or “I’m addicted to Halo 3.” What muddies the waters further are the dozens of articles that talk about “proof” that video games are addictive, that they cause real changes in the brain, changes just like drugs.

We live in a positivistic age, where something is “real” if it can be shown to be biological in nature. I could argue that biology is only one way of looking at the world, but for a change I thought I’d encourage us to take a look at the idea of gaming as addictive from the point of view of biology, specifically dopamine levels in the brain.

Dopamine levels are associated with the reward center of the brain, and the heightened sense of pleasure that characterizes rewarding experiences. When we experience something pleasurable, our dopamine levels increase. It’s nature’s way of reinforcing behaviors that are often necessary for survival.

One of the pieces of evidence frequently cited to support video game addiction is studies like this one by Koepp et al., done in 1998.  It monitored changes in dopamine levels in subjects who were playing a video game.  The study noted that dopamine levels increased during game play “at least twofold.”  Since then, literature reviews and articles with an anti-gaming bias frequently and rightly state that video games can cause dopamine levels to “double” or significantly increase.

They’re absolutely right: video games have been shown to increase dopamine levels by 100% (aka doubling.)

Just like studies have shown that food and sex increase dopamine levels:

This graph shows that eating food often doubles the level of dopamine in the brain, ranging from a spike of 50% to a spike of 100% an hour after eating. Sex is even more noticeable, in that it increases dopamine levels in the brain by 200%.

So, yes, playing video games increases dopamine levels in your brain, just like eating and having sex do, albeit less. But just because something changes your dopamine levels doesn’t mean it is addictive. In fact, we’d be in big trouble if we never had increases in our dopamine levels. Why eat or reproduce when it is just as pleasurable to lie on the rock and bask in the sun?

But here’s the other thing that gets lost in the spin. Not all dopamine level increases are created equal. Let’s take a look at another chart, from the Meth Inside-Out Public Media Service Kit:

This is a case where a picture is worth a thousand words.  When we read that something “doubles” it certainly sounds intense, or severe.  But an increase of 100% seems rather paltry compared to 350% (cocaine) or 1200% (meth)!

One last chart for you, again from the NIDA. This one shows the dopamine increases (the pink line) in amphetamine, cocaine, nicotine and morphine:

Of all of these, the drug morphine comes closest to a relatively “low” increase of 100%.

So my point here is twofold:

1. Lots of things increase dopamine levels, and most of them are not drugs.

2. Drugs produce a much more marked, sudden, and intense increase in dopamine levels than video games do (a rough back-of-the-envelope comparison follows below).
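To make the scale of that difference concrete, here is a minimal sketch (mine, not taken from any of the studies), assuming only the rough percentages quoted above and an arbitrary resting baseline of 100; it illustrates relative magnitude, not precise pharmacology:

```python
# Back-of-the-envelope comparison of the dopamine spikes cited in this post.
# The percentage increases are the ones quoted above; the baseline of 100 is
# arbitrary, so only the relative sizes mean anything.

baseline = 100  # arbitrary resting dopamine level

increases = {
    "eating food": 50,        # charted at roughly 50-100%; low end used here
    "video games": 100,       # "at least twofold" per Koepp et al. (1998)
    "sex": 200,
    "cocaine": 350,
    "methamphetamine": 1200,
}

for activity, pct in increases.items():
    peak = baseline * (1 + pct / 100)
    print(f"{activity:16} +{pct:5d}%  ->  peak ~{peak:.0f} (baseline {baseline})")
```

Run it and the point of the charts stands out: the everyday rewards (food, games, sex) cluster within a few hundred percent of baseline, while cocaine and methamphetamine spike far beyond them.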

Does this mean that people can’t have problem usage of video games? No. But what it does mean, in my opinion, is that we have to stop treating behaviors as if they were controlled substances. Playing video games, watching television, eating, and having sex are behaviors that can all be problematic in certain times and certain contexts. But they are not the same as ingesting drugs, they don’t cause the same level of chemical change in the brain.

And we need to acknowledge that there is a confusion of tongues where the word addiction is involved. Using it in a clinical sense is different than in a lay sense– saying “I’m hooked on meth” is not the same as saying “I’m hooked on phonics.” Therapists and gamers alike need to be more mindful of what they are saying and meaning when they say they are addicted to video games. Do they mean it is a psychological illness, a medical phenomenon? Do they mean they can’t get enough of them, or that they like them a whole lot? Do they mean it is a problem in their life, or are they parroting what someone else has said to them?

I don’t want to oversimplify addiction by reducing it to dopamine level increase. Even in the above discussion I have oversimplified these pieces of “data.” There are several factors, such as time after drug, that we didn’t compare. And there are several other changes in brain chemistry that contribute to rewarding behavior and where it goes awry. I just want to show an example of how research can be cited and misused to distort things. The study we started out with simply found that we can measure changes in brain chemistry which occur when we do certain activities. It was not designed or intended to be proof that video games are dangerous or addictive.

Saying that something changes your brain chemistry shouldn’t become the new morality. Lots of things change your brain chemistry. But as Loretta Laroche says, “a wet towel on the bed is not the same as a mugging.” We need to keep it complicated and not throw words around like “addiction” and “drug” because we want people to take us seriously or agree with us. That isn’t scientific inquiry. That’s hysteria.

Find this post interesting? I can speak in person too:  Check out the Press Kit for Public Speaking info. And, for only $4.99 you can buy my book. You can also Subscribe to the Epic Newsletter!

 

Guild Wars: The Conservative Attack on Online Therapy


“European commerce during the Dark Ages was limited and stifled by the existence of a multitude of small kingdoms that were independently regulated and who suppressed the movement of goods across their borders through a confusing and inconsistent morass of taxation, tariff, and regulation. This forced merchants to find another solution to move their goods, one that would avoid the strangulation that resulted from this cumbersome regulatory model. These merchants chose to move their goods by sea without being subject to the problems that were created by this feudal and archaic design, a move that changed the world. The little kingdoms took hundreds of years to catch up.”

–Harris, E., & Younggren, J. N. Risk management in the digital world.

Keeping up with policy is not my favorite thing:  But if I am to continue to be a consultant to therapists building their business and an educator on integrating technology into social work practice, it is part of the prep work.  So when a recent client asked me a question about licensure and online therapy in our Commonwealth of Massachusetts I surfed on over to our Division of Professional Licensure to take a look.  Good thing I did, and a lesson for all of you thought leaders and innovators out there, regardless of what state you live in.

There wasn’t much about technology, except for the interesting fact that the past several Board Meeting minutes made mention of a Committee discussion open to the public on “E-practice policy.”  I assumed (correctly it turns out) that this meant that the Social Work Board was formulating a policy, so I reached out to the Division and asked some general questions about what it was going to look like.  The answer was prompt and pretty scary.

The representative stated in her email to me that the “Board feels as if the use of electronic means should be employed as a last resort out of absolute necessity and it is not encouraged. The social worker would have the burden of proof that electronic means were employed as a last resort out of absolute necessity.”

I have several concerns about this.

Before elaborating on them, I want to explain that my concerns are informed by my experience as a clinical social worker who has used online therapy successfully for several years, as well as an educator nationwide on the thoughtful use of technology and social work practice.  I have had the opportunity to present on this topic at a number of institutions including Harvard Medical School and have created the first graduate course on this topic for social workers at Boston College.  In short, this issue is probably the most defining interest and area of study in my career as a social work clinician, educator and public speaker.

I also am a believer in regulation, which is why I have been licensed by the Board of Licensure in Oregon, and am in the process of similar applications in several states, including CA and NY, so that I may practice legitimately in those jurisdictions. I am a very concerned stakeholder in telemedicine, and here are only a few of my concerns about a policy of “extenuating-circumstances-only-and-be-ready-to-prove-it”:

 

  1. E-Therapy is an evidence-based practice.  It has been found to be extremely efficacious in a number of peer-reviewed studies, over 100 of which can be found at http://construct.haifa.ac.il/~azy/refthrp.htm.  In fact, telemedicine has been found to have comparable efficacy to in-office treatment of eating disorders (Mitchell et al., 2008), childhood depression (Nelson et al., 2006), and psychosocial case management of diabetes (Trief et al., 2007), among others.  To limit an efficacious modality of treatment by saying it may be used only in an “extenuating” circumstance, or as a discouraged last resort, would be a breathtaking reach and a troublesome precedent on the part of the Board, something that to the best of my knowledge has not been done with any other treatment modality.  Telemedicine was also endorsed by the World Health Organization 3 years ago.  And as I wrote this post, the University of Zurich released research showing online therapy is as good as traditional face-to-face therapy, and possibly better in some cases (Wagner et al., 2013).
  2.  Placing the burden on the individual social worker to account for why this treatment modality is justified by the necessity of extenuating circumstances also raises the issues of parity and access.  Providers familiar with the issue of mental health parity will hopefully see the parallels here.  Clinical social workers, for example, may become more reluctant to work with patients requiring adaptive technology if they realize that they could be held to a higher level of scrutiny and documentation than their counterparts who do not use online technology.  Even though the Board would possibly deem those circumstances “extenuating,” it would require an extra layer of process and bureaucracy that could have the side effect of discouraging providers from taking on such patients.
  3. Insurers such as Tricare and providers in the military are increasingly allowing reimbursement for telemedicine, and videoconferencing software is increasingly encrypted and in line with HIPAA.  While these should not be the reasons that drive telemedicine in social work, we should consider that a growing segment of the population finds it a reputable form of service delivery.
  4. Such policies require input from people with expertise in clinical practice, the law, technology, and the integration of the three.  When I asked about whether any members of the Board had experience with the use of different newer technologies in clinical practice or how to integrate them, I was informed that “the Board is comprised of members with diverse backgrounds. They have reviewed the policies and procedures for electronic means for many other jurisdictions as well as the NASW and ASWB Standards for Technology and Social Work Practice in addition to the policies set forth for Psychologists, LMHC’s and LMFT’s in MA.”

The NASW policy to which I believe she is referring was drafted 8 years ago, in 2005.  For context, that was 5 years before the iPad in 2010, 2 years before the iPhone in 2007, and 4 years before the HITECH Act in 2009.  In fact, the policy I reference says nothing about limiting technology such as online therapy to a “last resort;” rather, it encourages more social workers and their clients to have access to and education about it.  That professional organizations may be lagging behind the meaningful use and understanding of technology is not the Board’s fault.  But to rely on those policies in the face of recent and evidence-based research is concerning.  If the Board does wish to be more conservative than innovative in this case, I’d actually encourage it to consider the policy adopted by the Commonwealth’s Board of Allied Mental Health Professionals at http://www.mass.gov/ocabr/licensee/dpl-boards/mh/regulations/board-policies/policy-on-distance-online-and-other.html, which in fact makes no mention of a criterion of extenuating circumstances, nor does it potentially intimidate providers with a requirement of justification.

I hope the Board listens to my concerns and input of research and experience in the respectful spirit in which they are intended. I am aware that I am commenting on a policy that I have not even seen, and I am sure that the discussions have been deep and thoughtful, but I know we can do better.  As a lifetime resident of Massachusetts, I know we take pride in being forward thinkers in public policy.  Usually we set the standard that other states adopt rather than follow them.  I have invited the Board to call upon me at any time to assist in helping further the development of this policy, and have reached out to state and national NASW as well.  I hope they take me up on it, but I am not too hopeful.  I had to step down from my last elected NASW position because I refused to remove or change past or future blog posts.

If you practice clinical social work or psychotherapy online, it’s 3:00 AM:  Do you know what your licensing boards and professional organizations are doing?  Are they crafting policies which are evidence-based and value-neutral about technology, or are they drafting policies based on the feelings and opinions of a few who may not even use technology professionally?

This is a big deal, and you need to be involved, especially if you are pro-technology.  Research from Pew shows that 83% of people age 50-64 use the internet, about 10 percentage points fewer than younger adults, and only 56% of people 65 or older do. These older people and digital immigrants are often also the decision-makers who are involved in policy-making and committees.

If you don’t want to practice online, you may bristle at this post.  Am I saying that older people are irrelevant? No.  Am I saying that traditional psychotherapy in an office is obsolete? Absolutely not.  But I am saying that there is a backlash against technology from people who are defensive and scared of becoming irrelevant, and fear does not shape the best policy.  Those of us with experience in social justice activism know that sometimes we need to invite ourselves to the party if we want a place at the table.

And with government the table is often concealed behind bureaucracy and pre-digital “we posted notice of this public hearing in the lobby of the State House” protocols.  My local government is relatively ahead of the curve by posting minutes online, but I look forward to the day when things are disseminated more digitally, and open to the public means more than showing up at 9:30 AM on a work day.  If they allow videoconferencing or teleconferencing I will gladly retract that.

At their heart, divisions of professional licensure are largely about guildcraft:  They regulate quality for the good of the whole guild and the consumers who purchase services from guild members.  They establish policies and sanction members of the guild as part of establishing and maintaining the imprimatur of “professional” for the entire guild.  They develop criteria both to assure quality of services and to regulate the number of providers allowed in the guild with a certain level of privileges at any time:  LSWs, LCSWs, and LICSWs are the modern-day versions of Apprentice, Journeyman and Master Craftsman.  This is not to say guilds are bad, but it is to say that we need more of the senior members of the guild to advocate for technology if they are using it.

Too often the terms “technology” and “online therapy” get attached to the term “ethics” in a way that implies that using technology is dangerous if not inherently unethical.  That’s what I see behind the idea that online therapy should only be used as a “last resort.”  We thought something similar about fire once:  It was mysterious to us, powerful and scary.  So were books, reading and writing at one point:  If you knew how to use them you were a monk or a witch.

Technology has always been daunting to the keepers of the status quo, which is why you need to start talking to your policymakers.  Find out what your licensing boards are up to, advocate, give them a copy of this post.  Just please do something, or you may find your practice shaped in a way that is detrimental to your patients and yourself.

Like this post? I can speak in person too, check out the Press Kit for Public Speaking info. And, for only $4.99 you can buy my book. You can also Subscribe to the Epic Newsletter!

References

Wagner, B., Horn, A. B., & Maercker, A. (2013). Internet-based versus face-to-face cognitive-behavioral intervention for depression: A randomized controlled non-inferiority trial. Journal of Affective Disorders. doi:10.1016/j.jad.2013.06.032

Funderburk, B. W., Ware, L. M., Altshuler, E., & Chaffin, M. (2008). Use and feasibility of telemedicine technology in the dissemination of parent-child interaction therapy. Child Maltreatment, 13(4), 377-382.

Harris, E., & Younggren, J. N. (2011). Risk management in the digital world. Professional Psychology: Research and Practice, 42(6), 412-418. doi:10.1037/a0025139

Mitchell, J. E., Crosby, R. D., Wonderlich, S. A., Crow, S., Lancaster, K., Simonich, H., et al. (2008). A randomized trial comparing the efficacy of cognitive–behavioral therapy for bulimia nervosa delivered via telemedicine versus face-to-face. Behaviour Research & Therapy, 46(5), 581-592.

Nelson, E., Barnard, M., & Cain, S. (2006). Feasibility of telemedicine intervention for childhood depression. Routledge.

Trief, P. M., Teresi, J. A., Izquierdo, R., Morin, P. C., Goland, R., Field, L., et al. (2007). Psychosocial outcomes of telemedicine case management for elderly patients with diabetes. Diabetes Care, 30(5), 1266-1268.

Automation


Recently I was washing my hands in a public restroom.  The paper towel dispenser was one of those that automatically dispense.  There was a towel ready to be pulled off; you took it, and the dispenser automatically pushed another towel out for the next user.

I was in the middle of taking my fourth towel when I realized that my hands were long-since dry and that I was taking the towels continuously because the dispenser was offering them to me.

Technology offers itself to us, but technology doesn’t decide whether or not we should use it.  That is and always has been a human decision.  We can forget that, or ignore it, but we do so at our peril.

If the towel dispenser had been the motion-detected sort, the above story would never have happened to me, because I would always have had to exercise my agency to get it going.  Ironically that was what the Greeks meant when they first used the term automatos or αὐτόματος:  It came from autós (self) and a root meaning thought or will, and meant “self-moving, moving of oneself, self-acting, spontaneous” according to Wiktionary.  It wasn’t until the late 1940s that the term automation became more widely used, by the Ford Motor Company in reference to its new Automation Department.

Although my towel story might be funny to some (it was to me,) it has some serious implications when we think about social media and digital literacy, in particular for our children.  Let’s take this example:

[Image: Facebook status update box]

One of the things that has created a confusion of tongues in social media is the fact that we are bombarded with opportunities to share regardless of what the implication might be if we do.  The Facebook status update box is a great example:  As someone I know once said, “they gave us the box, but they didn’t tell us what to do with it.”

What is your status update? Is it how you are feeling?  What you are eating?  What you are doing/thinking/talking about?  If the box tells you to write something in it, do you have to?  If you are not feeling happy, sad or tired, do you leave it blank?  And what if you aren’t grateful for something right now?  The status update box can be seen as akin to the towel dispenser:  pushing out prompts for you to think or communicate a certain way, but not telling you how, or even that you have the choice to refrain from doing so.

In the 21st century, to educate our children and adolescents about personal responsibility and agency is to educate them in digital literacy.  This is the responsibility of adults who themselves were raised in a culture that never trained them how to deal with the increasing automation of society or the way social media has changed our brain, sense of self and the social milieu.

It may not come as a surprise that I have strong opinions about this, and they come in large part from my training as a clinical social worker.  I believe that social workers have a responsibility to help their clients achieve and improve their digital literacy.  In general, if you are a mental health provider I think it is your job to do this.  We are tasked with helping the human being in the social environment, and technology is part of the social environment for the majority of the population we serve.  If you do not know how to use Facebook then you are insufficiently educated to work with families and children in the 21st century.  If you are unaware of geotagging and the risk it poses to domestic violence victims seeking safety from their perpetrators, you are putting your clients in jeopardy.  If you are an LGBT-affirmative therapist and you don’t know about Grindr you won’t be effective.  If you are a psychotherapist and you don’t ask about your patients’ use of social media you are missing out on a significant part of their daily interactions, behaviors, thoughts and feelings.

Chances are that if you are reading this blog you are not one of my colleagues who is completely averse to technology, so I hope that you’ll pass on some of this info to your colleagues who are.  To the best of my knowledge there are only two graduate courses that teach social workers about the impact of technology on our clients, and I’m teaching them.  This will have to change if we are to remain relevant to the populations we serve.

Technology is offering itself to our clients every day in hundreds of ways.  It is up to us to remind them to pause and remember that they have agency.  If we don’t, then we are the ones who have become the machine.

 

Like this post? I can speak in person too, check out the Press Kit for Public Speaking info. And, for only $4.99 you can buy my book. You can also Subscribe to the Epic Newsletter!

Saving Ideas

[Image: cave painting]

Sometime over 40,000 years ago, someone decided to put images of human hands on the cave pictured above.  It turned out to be a good idea.  This painting has given scientists information on life in the Upper Paleolithic, raised questions about the capacity of Neanderthals to create art, and sparked debate about which species in the genus Homo created it.  Other, later cave paintings depict other ideas: bulls, horses, rhinoceros, people.

I wasn’t there in the Paleolithic, but I doubt that the images we are seeing in caves were the first ones ever drawn.  I imagine that drawing images in sand and other less permanent media happened.  I suspect that the only reason we have cave paintings is because at some point somebody decided they wanted to be able to save their idea, to keep it longer or perhaps forever.

Every day, 7 billion of us have untold numbers of ideas.  So what makes a person decide that an idea is worth saving?  What makes us pause and make a note in our Evernote App or Moleskine journal?  What inspires us to make a video of our idea on YouTube or write a book?  We can’t always be sure that an idea is a “good” one, or even what the criteria for a good idea are.  It usually comes down to belief.

In the past several centuries, the ability to save ideas was reserved for the few who were deemed skillful or divinely inspired.  Books were written in monasteries, then disseminated by printing presses, and as ideas became easier to save, more people saved them.  But, and this is very important, saving an idea doesn’t make it a good idea, just a saved one.  Somewhere along the line we began to get the notion that only a few select people were capable of having a good idea, because only a few select people were capable of saving them.  Even in the 21st century, many mental health professionals and educators cling to the notion that peer-reviewed work published in journals is the apex of quality.  If it is written, if it was saved by a select few, it must be a good idea.  If you have any doubt of what I’m talking about just Google “DSM V.”

With each leap in human technology comes the power to save more ideas and then spread them.  People who talk about things going viral often forget that an idea has to be saved first, and that in essence something going viral is really a form of society saving an idea.  If anything, technology has improved the democratization of education and ideas.

This makes many of us who grew up in an earlier era nervous and frustrated.  We call the younger generation self-absorbed rather than democratizing.  We grumble, “what makes you think you should blog about your day, take photos of your food, post links to cute kitten videos?”  We may even take smug self-satisfaction that we aren’t contributing to the static.  I think that’s a bad idea, although it clearly has been saved from earlier times.

40,000 years from now, our ideas may take on meanings we never anticipated, like the cave drawings.  Why were kittens so important to them?  In the long view, I think we should remember that people have to believe they have a good idea before they take the leap of faith to save it.  The citizens of the future may debate who saved kitten videos and why, but it will be taken as given that they must have been important to many of us.

What if everyone had the confidence to believe that they had an idea worth saving?  What if everyone had the willingness to believe that it just might be possible that their idea was brilliant?  Each semester I ask the students in my class to raise their hand if they think they can get an A- or higher in the class, and most do.  Then I ask them to raise their hand if they think they can come up with an idea in this class that could change the world.  I’ve never had more than 3 hands go up.  That’s sad.

This is why I admire the millennials and older groups who take advantage of social media and put their ideas out there.  I doubt that they are all good ideas, but I celebrate the implicit faith it takes to save them.  Anyone, absolutely anyone at all, can have a good idea.  It may not get recognized or appreciated, but now more than ever it can get saved.  Saving an idea is an act of agency.  It is a political act.  Saving an idea is choosing to become just a bit more visible.  On the most basic level saving an idea is a celebration and affirmation of the self.  Think about that, and dare to jot down, draw, record or otherwise save one of your ideas today.  I just did and it feels great.  Then maybe you can even share it with someone else.

What makes a person decide an idea is worth saving?

You do.

 

Like this post? I can speak in person too, check out the Press Kit for Public Speaking info. And, for only $4.99 you can buy my book. You can also Subscribe to the Epic Newsletter!

Bad Object Rising: How We Learn to Hate Our Educated Selves

Recently I had the opportunity to work with a great set of educators in a daylong seminar.  One of the things I do with teachers when I present is have them play Minecraft.  In this case I started off by giving a general presentation that ended with a story of autodidacticism in an Ethiopian village, where 20 children who had never seen the printed word were given tablets and taught themselves to read.  I did this in part to frame the pedagogy for what came next:  I had them turn on Minecraft and spend 30 minutes exploring the game without any instruction other than getting them networked.

The responses were as varied as the instructors, but one response fascinated me in particular.  Midway into the 30 minutes, one teacher stopped playing the game and started checking her email.  Later, when we returned to our group to have a discussion about the thoughts and feelings that came up around game play, this same teacher spoke up.  We were discussing the idea of playfulness in learning when she said, “You know, I hear a lot about games and learning, and making learning fun; but sometimes learning isn’t fun and you have to do it anyway.  Sometimes you just have to suck it up and do the work.”

“I’m not saying that I disagree with you entirely,”  I said.  “But then how do we account for your giving up on Minecraft and starting to check your email?”

She looked a little surprised, and after a moment’s reflection said, “fair enough.”

I use this example because these are educators who are extremely dedicated to teaching their students, and very academically educated themselves.  Academia has this way, though, of seeping into your mind and convincing you that academics and education are one and the same.  They’re not.

I worked in the field of Special Education for more than a decade from the inside of it, and one of the things I came to believe is that there are no unteachable students.  That is the good news and the bad news.  Bad news because, if a student were truly unteachable, they wouldn’t be able to learn from us that they are dumb or bad when they don’t demonstrate the academic achievement we expect.  I remember the youth I worked with calling each other “SPED monkeys” as an insult; clearly they learned that from somewhere and someone.  They had learned to hate themselves as a bad object, in object relations terms, or to project that badness onto other students.  They learned this from the adults around them, from the microaggressions of hatred they experienced every day.  By hate I’ll go with Merriam-Webster’s definition as close enough: “intense hostility and aversion usually deriving from fear, anger, or sense of injury.”

We tend to mistakenly equate hatred with rage and physical violence, but I suggest that this is because we want to set hatred itself up as something hated by, and other than, ourselves; surely we never behave that way.  But hatred is not always garbed in extremis.  Hatred appears every day to students who don’t fit the academic mold.  Hatred yells “speak English!” to the 6 year olds getting off the bus chatting in Spanish.  Hatred shakes its head barely (but nevertheless) perceptibly before moving on to the next student when the first has fallen silent in their struggle.  Hatred identifies the problem student in the class and bears down on her, saying proudly, “I don’t coddle my students.”  And Hatred shrugs his shoulders when the student has been absent for 3 weeks, and waits for them to be dropped from the rolls.

I’m not sure how I came to see this, because I was one of the privileged academically.  I got straight A’s, achieved academic awards and scholarships that lifted me into an upper-class world and peer group.  I wrote papers seemingly without effort, read for pleasure, and was excited to get 3 more years of graduate school.  And I have had the opportunity to become an educator and an academic myself, having taught college and graduate students.  I could have stayed quiet and siloed in my area of expertise, but work with differently-abled learners taught me something different.  It taught me that people learn to dislike education shortly after academia learns to dislike them.

Perhaps one of the best portrayals of adult hatred of divergent thinkers comes from the movie Matilda:

“Listen, you little wise acre. I’m smart, you’re dumb; I’m big, you’re little; I’m right, you’re wrong. And there’s nothing you can do about it.”

Nowadays I teach in a much different way than I did early on, before I flipped my classrooms and began facilitating guided learning experiences rather than encouraging people to memorize me and the ideas I had memorized from others.  And I struggle with this new approach, because I enjoy it so much I feel guilty.  You see, I have internalized the bad object too.  Even with my good grades I internalized it.  And any time I start to depart from the traditional mold of the educated self, I experience a moment of blindness, then a stony silence that seems to say, “you’re being lazy, you should make them a PowerPoint and prepare a lecture.”  Yet, if my evaluations on the whole, and student and colleague testimonials, have truth to them, I am a “good” educator.  So let’s say I am a “good” educator, and if I as a good educator struggle with this, we shouldn’t assume that people who struggle with these issues are “bad” educators.

In fact, when it comes to emerging technologies like social media and video games, educators often try to avoid them, if not because they are fun and suspect, then because educators risk experiencing themselves as the bad object: Who wants to experience themselves as hopelessly dumb, clumsy or lazy when they can experience themselves as the bountiful and perfectly cited fount of all wisdom?  Truth is, both are distorted images of the educated self.

Don’t forget that educators themselves experience tons of societal hatred.  For them it often comes in the guise of curriculum requirements or linking their performance to outcomes on standardized testing.  Hatred comes in the low salaries and the perception that people doing intellectual or emotional labor aren’t really working.  All of this helps educators to internalize a bad object which feels shaming and awful; is it any wonder that we sometimes unconsciously try to get that bad object away from ourselves and locate it in the student?

The good news as I said before is that we are all teachable.  We can learn to make conscious and make sense of the internalized bad object representations.  We can see that thinking of people in terms of smart or dumb is a form of splitting.

And yes, there’s a lot we can do about it.

Like this post? I can speak in person too, check out the Press Kit for Public Speaking info. And, for only $4.99 you can buy my book. You can also Subscribe to the Epic Newsletter!

 

Empathy (Re)Training

 


Last night I was mining Gold Omber in the asteroid belt near Erindur VI, and I can’t begin to tell you what an accomplishment that was. (This post is not just about spaceships, but about pacifists, Ethiopia and education, so non-gaming educators and therapists keep reading.)  OK, let me tell you why it was an accomplishment.  I am talking about playing the MMO EVE Online, which involves piloting spaceships across vast amounts of space in order to mine, trade, build or pirate among other things.  In essence, your spaceship is your character, with the ship’s parts being the equivalent of your armor and weapons in games like World of Warcraft.  But you can only build or use these parts as your pilot acquires skills, ranging from engineering to planetology to cybernetics, so in that way the player’s pilot is the character in the game.  But at the start of the game you’re told that the pilot is actually a clone (this becomes important later on) and as someone was explaining to me last night the whole cloning thing has its own complications once you start using implants to modify individual clones, which you can only do after you’ve trained the skill of Cybernetics.  And why all that is important is because once you use implants you can learn skills more quickly.

If you think that is confusing, try learning how to use the sprawling user interface or UI, which one of my friends says “was made by demons who hate people, hate their hopes and dreams. Know that you are playing with toys made by demons for their amusement and tread lightly.”  Another way of putting it is that you have to keep trying to remember what window you opened in the game to do what, and often have multiple windows open simultaneously in order to figure out what you’re doing or buying or training.  There is a robot tutorial program in the game that helps somewhat, but the whole thing is very frustrating and intriguing for the first several hours of game play.  During this time I was ganked repeatedly, lost lots of loot and ore I had mined, as well as a nice spaceship or two.  So to get to the point where I had learned enough skills to be able to warp halfway across the galaxy, lock onto an asteroid, orbit and mine it while defending myself from marauders was extremely exciting.  I was only able to do this because my above-mentioned friend had given me a much bigger and safer ship than I had started out with, as well as lots of instructions on how to do things; and because I was chatting with people in the game who offered great tips.  Of course one of those people then clicked on my profile in chat to locate me and gank me again (bye-bye nice ship,) but the knowledge is mine to keep.

By now you may be asking, “What has any of this got to do with psychotherapy, social work or education?” so I’ll explain.  I had tried EVE months ago and given up after about a week of on-and-off attempts, but this past month I have begun teaching an online course for college educators and MSWs about integrating technology into psychotherapy and education.  One of the required exercises in the course is for the students to get a trial account of World of Warcraft and level a character to 20.  There has been a lot of good-natured reluctance and resistance to doing this, in this class and others:  I have been asked to justify this course material in a way I have never had to justify other learning materials to students.  Several students objected to playing the game because of its violent content before they had played it much, or at all.  It’s as if people were not initially able to perceive the course material of World of Warcraft as belonging to the same oeuvre as required readings or videos.  It is one thing to bring up in your English literature class that you found the violence in “Ivanhoe” or the sex in “The Monk” objectionable after reading it, but I’ve not heard of cases where students refused to read those books for class based on those objections.  So I was curious: what made video games so different in people’s minds?

Things became easier for several folks after I set up times to meet them in the game world and help them learn and play through the first few quests.  As I chatted with them and tried to explain the basic game mechanics, I realized that I had learned to take certain knowledge and skills for granted, such as running, jumping, and clicking on characters to speak with them.  I started to suspect that the resistance to playing these games was connected to the tremendous amount of learning that has to happen in order to even begin to play.  In literacy education circles we would call this learning pre-readiness skills.  Being thrown into a new learning environment in front of peers and your instructor is unsettling, immediate, and potentially embarrassing.  And I think being educators may have actually made this even harder.  According to Ken Robinson, education in the dominant paradigm of the 20th and 21st centuries seeks to create literary critics and professors as its ultimate outcome.  So here was a group of people who had excelled at reading and writing suddenly being asked to learn and develop an entirely new and different skill set within the framework of a college course:  Of course they were frustrated.

So I started playing EVE again not just to have fun, but to have a little refresher course in empathy.  I have leveled to 90 in WoW, so I know how to do things there, and I had begun to forget how frustrating and bewildering learning a new game can be.  In EVE I have been clueless and failing repeatedly, and getting in touch with how frustrating that learning curve can be.  I have also been re-experiencing how thrilling it is the first time I make a connection between two concepts or actions in the game:  When I realized that there was a difference between my “Assets” and my “Inventory” I wanted to shout it from the rooftops.  I have begun to notice similar “learning rushes” in my students and to help them reflect on them.  They are now, in short, rocking the house in Azeroth.

We forget how thrilling and confusing it can be to learn sometimes, especially to the large population on the planet that doesn’t necessarily want to be a college professor or psychotherapist.  We forget that our patients and students are asked to master these frustrations and resistances every day with little notice or credit.

There is a village in Ethiopia where 20 children were given Xoom tablet computers last year.  The researchers and founders of One Laptop Per Child dropped the tablets off in boxes to these children, who had never learned to read or write.  The children were offered no instruction, and the only restriction placed on the tablets was that the camera was disabled.  Within minutes the children had opened the boxes and learned how to turn the computers on; within weeks they were learning their ABCs and writing; and within months they had learned how to hack the tablets and turn the camera back on, all without teachers.  This story inspires and terrifies many.  It is inspiring in that it tells the story of what children can learn if they are allowed to be experimental and playful.  It is terrifying because all of this was done without a teacher to lecture or a therapist to raise self-esteem, which raises the question, “Do we still need them?”

Having played EVE and taught academics in World of Warcraft, let me assure you that the world still needs teachers and therapists.  But the world needs us to begin to learn how to teach and help in a different way.  If EVE had offered nothing but online tutorials I would probably have struggled more and given up.  I needed to remain social and related in order to ask for help, listen to tips, and get the occasional leg up.  We need to retrain ourselves in empathic attunement by going to the places that scare or frustrate us, even if those places are video games.  The relationship is still important: to inspire, encourage and enjoy when learning happens in its myriad forms.  But we need to remember that there are many literacies, and that not all human beings aspire to teach an infinitely recurring scholasticism to others.  We need to remember how embarrassing it can be to “not get it,” and how heroic the people we work with are for continuing to show up, live, and be educated in a system that humiliates them.

What’s exciting and promising, though, is this simple fact:  Learning is happening everywhere, all the time!  Whether it’s in a village in Ethiopia, in Elwynn Forest in Azeroth, or in orbit around Erindur VI, learning is happening.  Across worlds real and imagined, rich and poor, learning IS happening.

And we get to keep all the knowledge we find.

 

Like this post? I can speak in person too, check out the Press Kit for Public Speaking info. And, for only $4.99 you can buy my book. You can also Subscribe to the Epic Newsletter!

Want To Help Stop Youth Cyberbullying? Let Your Kids Raid More.


The above title is misleading.  In fact it is as misleading as the term cyberbullying, an umbrella term used for experiences that range drastically.  “Cyberbullying” has been used to describe the humiliation of LGBT youth via video; the racial hatred directed at Sikhs on Reddit; the systematic harassment of a teenage girl by a neighboring peer’s mother, which ended in the girl’s suicide; a hoax in which a Facebook user pretended to be a woman’s loved one who had been missing for 31 years; and the bad Yelp reviews of a restaurateur in Arizona.

Wait, huh?

My point exactly:  All of the things described above differ in scope, intentionality, form of media used, duration, and impact.  We need to keep this complicated.  This is not to take away from, or excuse, the horrific acts that people have perpetrated with social media.  Rather, I think we need to help kids and their parents find more nuanced ways to make sense of how newer technologies are impacting us.

Social media amplifies ideas, feelings, and conflicts.  It often dysregulates family systems.  Many family members grew up without needing to learn the level of digital literacy that today’s world requires.  Parents may be tempted to put their children in a lengthy or permanent internet lockdown.  I hear the threats, or read them, all the time:  No screens.  You’re unplugged.  She’s grounded from Facebook.

Please don’t do that.

I’ve worked with a number of young adults who have had the experience of being on the receiving end of hatred, stalking, harassment and intrusion delivered via the internet.  And thank goodness their parents didn’t unplug them as kids.  Because they stayed online they got to:

  • learn how to ignore haters
  • see/hear others stand up for them in a social media setting
  • come to the defense of a peer themselves
  • increase their ability to meet verbal aggression with cognition
  • make the hundreds of microdecisions about whether to “fight this battle”
  • seek out support from other peers
  • form strong online communities and followings that helped them cope with and marginalize the aggressors

Online technologies are increasingly becoming a prevalent form of communication, and to take away access is to take away the hearing and the voice of youth.  To do this is disempowerment, not protection.

I’ve said before that parents need to take an engaged approach with their kids in order to be there to help them understand and process the conflicts that are communicated through and amplified by social media.  But this time I want to go further and suggest that one way to help kids achieve digital literacy in terms of social skills is to let them play more multiplayer video games.

Many of you probably saw that coming, but for those of you who didn’t, let me explain.  21st century video games are themselves a powerful form of social media.  Multiplayer games allow individuals to band together as guilds, raids, platoons and other groups to achieve higher endgame goals.  Collaboration is built into them, both as part of the fun and as a necessity for meeting the challenges.

There are exceptions to this, but it has been my experience that people don’t begin systematic personal attacks on each other when they are in the middle of downing Onyxia.  They are too busy joining forces to win.  I am convinced that much of the hatred we see in the developed world exists in large part because of boredom and apathy.  Games provide an alternative form of engagement to hatin’.

Look, I’m not saying that people playing games never say sexist things, swear, or utter homophobic comments.  But I can say that I have heard more people, adults and children alike, stand up to hatred in World of Warcraft than I ever did in the two decades I worked in public school settings.  I’ve seen racism confronted numerous times in guild chat, and seen rules for civility created and enforced over and over, always citing a variation of the same reason:  “We’re all here to have fun, so please keep the climate conducive to that.”

Video games provide powerful interactive arenas for diverse groups of people to collaborate or compete strategically.  They capture our interest with a different sort of drama than the sort that we see our youth struggle with in other settings.  In fact, for many individuals video games provide a welcome respite from the drama that occurs in those other settings.

Social media does indeed amplify nastiness, harassment and hatred.  It also amplifies kindness, hope, generosity and cooperation.  If we don’t lean into social media with our kids, they’ll never know how to use it to amplify goodness in the world.  Worse yet, if we cut them off from connecting with the world online, we’ll deprive them of the opportunities they need to recognize and choose between good and evil.

Like this post?  I can speak in person too, check out the Press Kit for Public Speaking info.  And, for only $4.99 you can buy my book.  You can also  Subscribe to the Epic Newsletter!