Perhaps you're among the more than 36 million people who have watched social psychologist and Harvard professor Amy Cuddy's TED talk about "power poses" — poses she says make people feel more powerful and more willing to take risks, such as leaning back in your chair or standing with your hands on your hips.

Cuddy, a body language expert, has built much of her career on "power poses." In addition to a myriad of speaking engagements, she has been widely featured in media outlets including The Wall Street Journal, The New York Times and CNN, discussing the benefits of and the science behind "power posing."

The idea of "power poses" came from a 2010 Psychological Science study co-authored by Cuddy and two then-Columbia University professors, Andy Yap and Dana Carney.

But last Sunday, Carney dismissed everything that Cuddy has been teaching about "power poses." Now an associate professor at the University of California, Berkeley, Carney posted on her faculty website that she has no "faith in the embodied effects of power poses."

"As evidence has come in over these past 2+ years, my views have updated to reflect the evidence. As such, I do not believe that 'power pose' effects are real," her post said.

Carney based her reversal on "failed replications" of the original findings and her hindsight view that she and her fellow researchers engaged in some degree of data dredging.

It's almost unprecedented in psychological research for someone to make such an admission, and to do so as boldly as Carney has.

NPR reached out to Cuddy for a response to Carney's claims, and a representative from her publisher, Nicole Dewey of Little, Brown and Company, issued a reply via email. Cuddy expressed surprise at Carney's post and noted that since the evidence was first published, the "power posing" effect has been replicated in several other published studies from different labs.

She said, "I have confidence in the effects of expansive postures on people's feelings of power — and that feeling powerful is a critical psychological variable. You can read Carney's full response at the end of this article.

NPR's Scott Simon spoke with Uri Simonsohn, a behavioral scientist at the Wharton School, about the "power poses" research, the implications of Carney's admission for the scientific research community and whether we should take similar studies seriously.


Interview Highlights (including some web-only content)

On what the initial study suggested and where it went wrong

As with much of science, early findings are tentative and should be followed up. That's at one level. At another level, psychology has been changing a lot in the last five or six years. People have been doing more rigorous, larger-sample studies with more careful descriptions. And this study was conducted before all that change was happening, so the standards were lower – at least as seen from the current perspective.

On whether the researchers reached these conclusions to give a TED Talk

I don't think so. I can't imagine they were thinking about a TED Talk at the time. And so I think it's important to distinguish between the original, narrow, a little dry paper that came out in 2010, which is what Dana's talking about, and then the TED Talk, which is inspired by that study but also draws a lot on Amy Cuddy's personal experiences. I think of that talk as more of a motivational speech than a scientific presentation.

On whether Cuddy's motivational talk was grounded in scientific research

I think so. Maybe the scientific foundation was exaggerated or overstated. But I think it's useful to keep those two separate: Dana Carney and Amy Cuddy have thought of this from different perspectives, one much more basic science, the other much more in terms of giving advice.

On whether an alarm bell went off when he first heard about "power poses"

Not particularly. I wasn't aware of it until maybe 2013 or 2014. The alarm bell for me went off recently, maybe a year or so ago, when a team of researchers in Switzerland failed to replicate it. And then the original authors summarized the entire body of work that had looked at this, and some co-authors and I have a tool – a statistical tool – that will tell you if a body of work has what we call "evidential value."

Whether you should trust it – whether there's reason to believe that, as a whole, it's telling you what you think it's telling you. And when we applied that technique to the entire body of work on power posing, we found that no, there's no reason to believe that, and you should be quite skeptical of the existing evidence. That doesn't mean the effect couldn't be documented in the future; it just means the existing evidence for it doesn't have the value you would wish it to have.
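Simonsohn doesn't name the tool in the interview, but he and his co-authors are best known for p-curve, which asks whether the statistically significant p-values in a body of work bunch up near zero the way a real effect would produce. Below is a minimal, hypothetical sketch of that core intuition – the actual p-curve analysis is considerably more elaborate, and the p-values here are invented for illustration.

```python
# Minimal sketch of the intuition behind an "evidential value" test such as
# p-curve. If studies are probing a real effect, their significant p-values
# should pile up near zero (right-skewed); if there is no effect, significant
# p-values are spread evenly between 0 and .05.
from scipy.stats import binomtest

def has_evidential_value(p_values, alpha=0.05):
    """Crude right-skew check on the significant p-values in a literature."""
    significant = [p for p in p_values if p < alpha]
    if not significant:
        return False
    # With no true effect, a significant p-value is equally likely to fall
    # below or above alpha/2; a real effect pushes most of them below it.
    low = sum(p < alpha / 2 for p in significant)
    return binomtest(low, n=len(significant), p=0.5,
                     alternative="greater").pvalue < 0.05

# Invented p-values from a literature that looks like a real effect ...
print(has_evidential_value([0.001, 0.004, 0.012, 0.003, 0.020, 0.008]))  # True
# ... and from one that looks like noise plus selective reporting.
print(has_evidential_value([0.049, 0.041, 0.032, 0.044, 0.038, 0.047]))  # False
```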

On whether, in hindsight, the research was flawed by inadequate design from the get-go

If you think about the time, the only thing I'd say is that it had a small sample. But back then – and it's funny, because it's only six years ago, but a lot has changed – it was very common to have a very small sample. And Dana also talks about choices they made when they were reporting the study that, in hindsight, seem like not the ideal way of doing things.

Imagine you're doing a journalistic story and you want to prove a point. So you ask one person a question and you don't quite like the answer, so you say, "I'm going to ask five people, best out of five." So if after five people you don't get it, you say "Best out of 10." And once you have 10 people, if the answer looks good, then you stop.

And so that kind of thing used to be quite common in psychology. It actually is – to the surprise of some – also common in other branches of science, like biology. It's not uncommon for people to run, say, five mice, and if things don't look that promising, they collect another five mice. And so those kinds of things that seem innocent end up messing up the statistical analysis.
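To make concrete why that kind of peeking "ends up messing up the statistical analysis," here is a small, hypothetical simulation in the spirit of Simonsohn's example: two groups with no real difference between them, tested once at five observations per group and again at 10 if the first look isn't significant. All the numbers are illustrative choices, not anything from the original study.

```python
# Simulate "best out of five, then best out of 10": test early, and only
# collect more data when the first result isn't significant.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
SIMULATIONS, FIRST_LOOK, FINAL_N = 5_000, 5, 10

false_positives = 0
for _ in range(SIMULATIONS):
    # Both groups come from the SAME distribution: any "effect" is noise.
    a = rng.normal(size=FINAL_N)
    b = rng.normal(size=FINAL_N)
    if ttest_ind(a[:FIRST_LOOK], b[:FIRST_LOOK]).pvalue < 0.05:
        false_positives += 1  # stopped early on a lucky result
    elif ttest_ind(a, b).pvalue < 0.05:
        false_positives += 1  # added more data and got lucky on the retest

# A single test at a fixed sample size is wrong about 5% of the time;
# even one extra peek pushes the rate noticeably higher (roughly 8% here).
print(f"False-positive rate with peeking: {false_positives / SIMULATIONS:.1%}")
```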

On whether we should take a cue from this when we read other studies

I think by the time we're communicating through the media – to the general public, I mean – it should be very well established. So I think textbooks would be a better source than the most recent study. And if we're covering the most recent study, we should realize: these are ideas, these are prototypes.

Think about the way we're talking now about self-driving cars: it's promising, we think it's going to happen, but we're not reporting that the streets are filled with them right now. A lot of the time, when we have these prototypes of ideas – the first one or two studies – the media and people like to think of them as, "This is done." ... And we're really just starting with them.

Just think about it: if you go back and look at the technology section of any newspaper, I'm going to guess that only a small share of the things presented as the future ever become the present. The journals in science are a bit like that. They are very well-informed guesses about what's going to happen next, but they're not solid knowledge that we should take for granted.

On whether it's difficult for a scientist to say "I was wrong"

I think it is incredibly difficult, and that's why I think very highly of Dana for having done this. Because, basically, what she did wrong, quote-unquote, is what people thought was normal back when she did it. What she's doing now is not normal. It is very unusual for somebody to look back at their own work and come forward publicly, with nothing to gain, and say, "You know what, I don't think that this is really there, and I'm just going to share my skepticism." ... That is very, very rare.

Copyright 2016 NPR. To see more, visit http://www.npr.org/.
