Copyright 2015 NPR. To see more, visit http://www.npr.org/.

Transcript

ARUN RATH, HOST:

Even if you have good coverage, it can be hard to find a mental health provider. And as Jenny Gold just mentioned, a lot of people are still held back from seeking help because of the stigma. Robert Morris thinks social media can help. He created a social networking app called Panoply. Panoply engages a group of patients in cognitive behavioral therapy. It's essentially a way to reframe thoughts and experiences. Morris says that while he was inspired by existing social networking apps, Panoply is unique because it has a specific goal - to treat clinical depression.

ROBERT MORRIS: The intuition is: can we somehow leverage that framework, which we know brings eyeballs back to the screen, and, in a very clever way, inject therapeutic content?

RATH: So can you explain how Panoply works - how you're able to have users interact with each other in a way that helps them out like that?

MORRIS: One way to describe it is to just go through an example. So imagine you got called in for a performance review, let's say by your boss, and you walk into her office, and she looks at you, furrows her brow, looks a little disgusted, says not now, and kind of ushers you out of the room. If you're like me and kind of prone to pessimistic, depressive styles of thinking, you might think, oh my gosh, my review's going to be terrible. My boss hates me, et cetera, et cetera.

So the idea with this app is you would put that situation and those cognitions in, and then this crowd of people, trained to systematically help you readjust your thinking back to the real world, would respond. They might do things like identify what we call cognitive distortions - specific bugs in your thinking. So they might say, well, you might be mind reading a little bit there; you don't know for sure what your boss is thinking. You might be over-generalizing a bit, assuming from this one interaction how she thinks about you broadly. And the basic idea with the app is that when you're stressed, it's very hard to think creatively and with poise, so you're leveraging other well-meaning people to help you rewrite the story.
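
The workflow Morris describes maps onto a simple data model. Below is a minimal sketch in Python, using hypothetical class and field names rather than Panoply's actual code: a poster submits the situation and the automatic thought it triggered, and each crowd response tags the likely cognitive distortions and offers a reappraisal.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Reframe:
    distortions: List[str]   # e.g. ["mind reading", "overgeneralization"]
    reappraisal: str         # a more balanced reading of the same facts
    upvotes: int = 0         # lightweight feedback from the original poster

@dataclass
class ThoughtPost:
    situation: str                                 # what objectively happened
    thought: str                                   # the pessimistic interpretation
    responses: List[Reframe] = field(default_factory=list)

# The performance-review example from the interview, expressed in this model.
post = ThoughtPost(
    situation="My boss cut my performance review short and looked annoyed.",
    thought="My review is going to be terrible. My boss hates me.",
)

post.responses.append(Reframe(
    distortions=["mind reading", "overgeneralization"],
    reappraisal=("You don't actually know what she was thinking - she may just "
                 "have been busy. One rushed moment doesn't tell you how she "
                 "rates your work overall."),
))

for r in post.responses:
    print(r.distortions, "->", r.reappraisal)
```

The same structure supports the responder side described next: another user picks up a pending post, appends a reframe, and the poster can send back an upvote or thank-you.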

And then on the flip side - and this is where it actually gets really interesting - the app is designed so that you can also help other people. So let's say you're another user - you're standing in line waiting for coffee, and you see this thought come in about the boss. You might spend a couple of minutes coming up with some ways to help that person that are aligned with these evidence-based practices. It only takes a few minutes. You help someone out. You might get a thank-you note back from the person or an upvote. And it's a really fun, rewarding way to help people and, at the same time, practice a skill over and over that could help you in an enduring way when you're not on the app.

RATH: You find that people derive as much or even more benefit from giving advice as from getting it.

MORRIS: Yeah, absolutely. I think it's a great way to rehearse the skills - by teaching them to other people repeatedly. I generally find that the best way to learn something is to teach it to someone else. So on this app, we have a lot of people chiming in and teaching these skills. And I think it also brings a sense of self-efficacy - hey, if I can do this for someone else and they found it really useful, maybe when I'm stressed I can turn it back onto myself.

RATH: I hate to say it, but the first problem I think about is how do you screen out the kind of Internet trolls who might delight in undermining someone's progress?

MORRIS: Yes, evil on the Internet is a perennial problem. There are a couple of things right off the bat I can tell you. One thing I did in my experimental study is that before any content went out to any of the users, it was screened by three of what we call digital crowdworkers - people who work on a site called Mechanical Turk. They were each asked to just read these three sentences and let me know if there's any abusive content in it, any bullying, so we have a set of eyes looking at everything before it comes through.
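
That screening step can be sketched as a simple gate. The function name and the unanimous-clean rule below are assumptions for illustration; the interview only says that three Mechanical Turk workers read each reply and flag abusive or bullying content before it is released.

```python
def screen_reply(reply_text: str, worker_flags: list) -> bool:
    """worker_flags: one boolean per crowdworker, True meaning 'abusive/bullying'.
    Publish the reply only if no screener flagged it (assumed rule)."""
    return sum(worker_flags) == 0

# Three independent screeners reviewed the reply; none flagged it.
flags = [False, False, False]
if screen_reply("You might be mind reading a bit there...", flags):
    print("Reply released to the original poster.")
else:
    print("Reply held back.")
```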

And then the final tactic - something I learned at a computer science conference and thought was amazing; we aren't using it, but I just think it's really interesting - is that you can create what's called the misery experience. It's a very passive-aggressive way of restricting a troll or a bully on your site. You don't ban them outright - you just make the experience terrible for them. So it looks like the app doesn't work: maybe they open it and there's a spinning wheel of doom, and it looks like it's loading forever. The hope is that eventually they get bored, saunter off, and go bother someone else.
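
The "misery experience" amounts to degrading service for flagged accounts instead of banning them. A minimal sketch, with hypothetical names and a made-up flag list (Morris notes Panoply does not actually use this), might look like:

```python
import time

FLAGGED_USERS = {"troll_account_42"}        # assumed moderation list

def fetch_feed(user_id: str):
    """Return the feed, but make the app look broken for flagged accounts."""
    if user_id in FLAGGED_USERS:
        time.sleep(5)                       # artificial 'spinning wheel' delay
        return []                           # feed appears to never load
    return ["...replies from the crowd..."]

print(fetch_feed("regular_user"))           # normal experience
print(fetch_feed("troll_account_42"))       # deliberately degraded
```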

RATH: Robert Morris is a designer of interactive technology and a freshly minted Ph.D. from the MIT Media Lab. Robert, thanks very much.

MORRIS: Thank you.

Transcript provided by NPR, Copyright NPR.
