Impostor Syndrome and Cultural Rules

January 6, 2020
Posted by Jay Livingston

Many years ago, I was talking with a successful business consultant. He had a PhD in business from a prestigious school, was on the faculty of another prestigious business school, and for several years corporations had been paying him large sums to come in for a day or two. Still, he didn’t feel secure. “I keep thinking, What if they find out?”

I remembered that conversation when a thread on impostor syndrome showed up in my Twitter feed. What set people off was this tweet in response to someone who had tweeted about her own feelings of being an impostor.


True, it’s the individual’s problem, not society’s. Society doesn’t suffer if you feel like an impostor. But that doesn’t mean that social factors are irrelevant. What if impostor syndrome is more prevalent and more persistent among women than men? That would certainly suggest that the causes are social and not just psychological. In fact, many of the responses to this tweet argued that a person’s race, class, gender, age and other social variables might affect the probability that they would feel like an impostor. Or as Nathalie Olah in the Guardian (here) put it, “what seems more likely is that impostor syndrome is a rather natural reaction of anyone from a working-class, disadvantaged or minority background to the various biases they face on a daily basis.”

The studies on these variables are far from unanimous, if only because there’s no standardized way of measuring the crucial variable of feeling like an impostor. But I’d like to throw in one more variable — culture. My guess is that impostor syndrome is more prevalent and more deeply affecting in societies that emphasize success, societies like, oh I don’t know, the United States.

The British, by contrast, seem not so obsessed by success. In some instances, not being a success and not having the right stuff can put a person one up on the others in the room. I’m thinking here of Humiliation, a party game played by literature professors in Changing Places by British novelist David Lodge. Each person has to name a literary work, preferably one in the canon, that they have never read. You get a point for every other player who in fact has read it. The winner will be the one who has not read the classics that a lit professor would be expected to be familiar with — in other words, the biggest impostor.

Presumably, for the British and for less success-obsessed Americans the game is just a bit of fun. But for Howard Ringbaum, a success-driven American professor, the game creates an unresolvable conflict. “He has a pathological urge to succeed and a pathological fear of being thought uncultured, and this game set his two obsessions at war with each other, because he could succeed in the game only by exposing a gap in his culture.”

Ringbaum and Humiliation are fictions invented by a novelist. But Kate Fox is an anthropologist, and in Watching the English: The Hidden Rules of English Behaviour, she describes a similar game of “competitive self-deprecation” when real Brits talk about their home-improvement projects.


When showing visitors the results of your DIY efforts . . . a strict modesty rule applies. Even if you are highly skilled, you must always play down your achievements, and if possible play up your most embarrassing mistakes and blunders... 

‘I managed to burst three pipes just laying the carpet!’ 


‘We bought an expensive carpet, but I ruined it by cutting it four inches short, so I had to build some bookcases to cover the gap.’


‘You think that’s bad: it took me an hour and three cups of tea to put up a coat-hook board, and then I found I’d hung it upside-down!’


Here’s more British fun: In a dinner-party scene in the film “Notting Hill,” the competition involves not a specific area like literature or home-improvement but more or less the total self. Except for the newcomer in their midst – a famous actress (played by Julia Roberts) – the guests all know one another well, and as the host says, “I've long suspected, that we really are the most desperate lot of under-achievers.” At dessert time, there’s one extra brownie, and the host says he will award the brownie to “the saddest act here.”

The video is nearly four minutes long, but it’s worth watching.   



I can’t think of anything similar in American novels or films. Maybe such scenes exist, and in any case, the connection between cultural fictions and real life is not always clear. But the larger point is that impostor syndrome depends on the rules of the game. Where the rules of everyday life allow for the disclosure of personal flaws, people will be less likely to feel like an impostor, and that feeling will be less salient.

I Do Not Like Thee, Dr. Phil

January 4, 2020
Posted by Jay Livingston

I’ve had that title in my head for years now, but I’ve never written the blog post that goes with it. That’s partly because I don’t think that many people know this nursery rhyme.

I do not like thee, Dr. Fell,
The reason why I cannot tell;
But this I know, and know full well,
I do not like thee, Doctor Fell.

Exactly right. I do not like Dr. Phil, yet I would be hard pressed to articulate the reason why. In order to do so, I’d have to watch a lot of his TV segments, and since I don’t like him, I’d find that unpleasant, so I don’t. Which is the other reason I never wrote that post.

But now we have this photo of the dining room in Dr. Phil’s house.

(Click for a larger view.)

Would you refer a friend to a therapist whose dining room featured cartoonish animal statuary and a wall display of a dozen military-style guns? It’s not a decision you’ll have to make. Dr. Phil has not been licensed to practice since the earlier part of this century.

Also, there are some minor factual quibbles, which William Hughes at AVClub (here) raises and dismisses.

McGraw does not currently live in the California estate, which is now owned by the family trust and occupied at present by McGraw’s son. And there’s no evidence that he had any impact on its decoration, beyond, presumably, footing at least some of the cash for its eclectic collection of guns, “FUCK” paintings, and big giant lips. And yet, we have to assume that McGraw has at least stood in this building in its current state, muttering approvingly of the dining room gun wall, the gaudy Star Wars art, and, in what has quickly become our favorite touch, the legion of small Lego-ish bears scattered around the home, including two who are dressed up like Batman and the Joker, because hey, fuck it, that’s fun.

The larger point is first that sometimes your initial impression of a person, immediate and without rational thought, is right; and second, that you may have to wait a long time — until you see their dining room decor — to know that you were right.

Mrs. Maisel Gets One Right

December 26, 2019
Posted by Jay Livingston

Since the new season of Mrs. Maisel dropped not long ago, my post from nearly two years ago about its language anachronisms (here) has been getting some action. It’s still the most frequently viewed and commented-on item on this blog, and some of the newer comments made it clear that the anachronisms were still in bloom.

I watched the first couple of episodes recently, and sure enough, in Episode 2, “It’s the Sixties, Man!” we got contextualize.



In a blog post (here) about trendy words at the ASA meetings, I cited contextualize, but that was a report from academia in 2013, not New York family life in the early 1960s.

To the show’s credit, it did have a period-perfect language moment. Joel has been speaking with the older Chinese couple who own the Chinatown space he has rented, planning to turn it into a club. He discovers that the space includes a Chinese gambling parlor. Worried about trouble from the police, he meets with the owners. After some back-and-forth to deal with Joel’s fears, the couple — through their translator,* the young med-student Mei — allude to a murder, significantly raising Joel’s level of anxiety.

After the Chinese couple leave, Joel is discussing the matter with Mei. What about the murder, he asks.


Talk of the “put-on” and “putting you on” came on the scene in the late 1950s, starting among young, hip people like Mei and eventually spreading throughout the society. I thought that its use had declined in the 21st century, but Google nGrams shows that at least until 2000, it was still found in books.


Still, my impression is that we rarely refer to “putting people on” these days. But what has replaced it?

--------------
* For anyone still keeping score, another anachronism of sorts — this one linguistic: the owners are speaking Mandarin. In the early 1960s, the language spoken in Chinatown was Cantonese. Immigration from Mandarin-speaking areas of China did not begin until the 1970s at the earliest.

Acting and Reacting as an Agent of Culture — Moi?

December 21, 2019
Posted by Jay Livingston

A long time ago I heard Margaret Mead speak, and one of the only things I remember her saying was this: “A society has to teach its cultural ideas to its children in terms so clear that even a social scientist can understand them.”

I am, allegedly, a social scientist, but only an encounter with something very unusual can jar me into seeing my own culture. Like most people, I usually take it for granted, like the air we breathe. That was the point of the previous post, where a psychologist was urging dog owners to give their dogs more choice. It took this extending of human culture to canines to remind me of the great emphasis American culture gives to individual independence and choice. All those times that I had heard parents, me included, ask their kids if they wanted Cheerios or Chex, it had never occurred to me that we were drilling a particular American value into the minds of our little tots. I thought we were just being parents.

I had a similar cultural-blindness experience a few years ago. A student born and raised in Turkey came for his obligatory meeting with his advisor — me. He was a grown man in his forties. “What courses I should take?” he asked. I explained about the core requirements and recommended he take the first in the sequence. “And then there are the electives” I said and showed him the list.

“Which courses I should take?”

I explained that these were electives. He could take any of the ones we were offering that semester. If you’re interested in family, you could take that course, I said. If you’re interested in religion, we have that elective.

“Yes, but which ones I should take.”

I found it incredibly frustrating. What was so complicated about the concept of electives? It did not occur to me that our differences were cultural. I was so thoroughly an American that I could not imagine anyone rejecting the freedom to make their own choice of courses. Who would not seize that opportunity? Only someone who did not understand.

In retrospect, I now think that he did in fact understand. He just didn’t think it was such a great idea that the choice should be made by him rather than by a professor — department chair no less — who knew much more about the courses and the instructors. Maybe he was right.

There’s something else to be said for his approach. It creates a personal link between the advisor and the student in a way that the independent-choice model conveniently avoids. When he was asking me to choose courses for him, the thought crossed my mind that I could tell him to sign up for some of the low-enrolled courses that were in danger of being cancelled — courses students were avoiding because of the reputation of the course, the instructor, or both. That certainly would have made things easier for me as department chair. But I now felt that I had to look out for his best interests as well. I felt an obligation that was different and perhaps stronger than what I would feel towards other students.

As I say, when all this was happening, I didn’t think about the underlying cultural differences. I just felt uncomfortable. I will leave for another post the time when he presented me with a large assortment of almonds, figs, pistachios, etc., while I tried to explain to him the university rules about gifts.

Raise Your Dog to be an American

December 19, 2019
Posted by Jay Livingston

My local online site WestSideRag today ran an article with the title “Barnard Researcher Has Studied The Minds of Upper West Side Dogs, and They’re Way More Complicated Than You Think.”

I don’t have a dog, but I started reading.* And as I did, I saw that while the article was about dogs, it was more importantly a document about American culture, particularly our values and beliefs about Choice. We value individual choice as inherently good. We also believe that choice is beneficial and that denying people the freedom to choose will in some way harm them. So we insist that people make their own choices.

Recognizing the wonderfulness of choice is not something that comes naturally. You have to be carefully taught. And it’s never too early to start. It’s just that most of the time, we don’t think that we are hammering American cultural ideas into our kids’ psyches. We just think we’re raising them the right way.

In My Freshman Year, an ethnography of an American university, Rebekah Nathan** includes a chapter (“As Others See Us”) about the perceptions of the foreign students. A Korean student tells her:

Everything here is: “What do you want?” “What do you think?” “What do you like?” Even little children have preferences and interests in this country. I hear parents in restaurants. They ask a three-year-old child, “Do you want French fries or potato chips?” Every little kid in this country can tell you, “I like green beans but not spinach, I like vanilla but not chocolate, and my favorite color is blue.”

If we think it’s good for three-year-olds to make their own choices, why not dogs?

All dog owners should allow their dog to make certain choices, according to Horowitz, who strongly believes that giving dogs choices increases their welfare. . . . Owners should “allow the dog to make their own choice as opposed to your definition of the walk.” She recognizes that people want to feel in control, but points out “what we are in control of is to let the dog have a life in which the dog is partly choosing. This is something we want to give to anyone we love.”

WestSideRag has a relatively small readership — we’re not talking Slate.com — and an article extending our ideas about choice to dogs is extreme. But often the extreme case can call attention to the less extreme versions that are widely taken for granted and unnoticed. In America, even those with a more authoritarian outlook find it hard to refute arguments based on the idea of choice. It’s not just liberals who ask their kids what kind of cereal they want. 

-------------------------
* What originally drew me to the article was the opening paragraph, which contained a pun that I am nearly certain was unintended.

(Click on the box for a larger and clearer version.)

** “Rebekah Nathan” is a nom de plume. The author, Cathy Small, probably wanted to remain anonymous since she was writing about the school where she teaches. The ruse did not work for very long.

“Real” Disney Princesses

December 17, 2019
Posted by Jay Livingston

What do you do if you’re a good feminist parent — you want kids to see genders as similar, not exaggeratedly different — and your daughters keep watching those Disney princess movies?

If you’re Philip Cohen, you start blogging about gender dimorphism to call attention to how unrealistically the Disney characters are drawn. The blogposts have titles like “Disney’s dimorphism, ‘Help! My eyeball is bigger than my wrist!’ edition.” You link to data about the sizes of hands, wrists, necks, etc. among real men and women.*

A post gets picked up at more widely read sites like Slate, and you get comments complaining that “The less realistic the proportions, the more endearing and charming we find the character. The closer to realistic they are, the creepier/blander they can become.”

Maybe you wonder: What if someone Photoshopped the Disney characters to make them look more like real people? Well, someone has. Jirka Vinse Jonatan Väätäinen, a graphic designer in Finland (not too far from “Frozen” territory) has shrunk those princess eyeballs, enlarged those princess wrists and necks, and posted the results on his Website (here). See if you can tell which is which.


(Click for a larger view.)

-------------------
*You can find Philip’s dimorphism posts at his Family Inequality blog here.

Sacred Interiors — Full and Empty

December 17, 2019
Posted by Jay Livingston

Different Games, Different Rules: Why Americans and Japanese Misunderstand Each Other, by Haru Yamada, has been sitting on my shelf for a while, but I just now started reading it and came across this.

This contrast between the Judaeo-Christian and Buddhist viewpoints is probably what prompted my mother to look up at the domed ceiling of a Catholic church in Florence painted with cherubs and scenes of men and women in heaven and hell, and say, “I guess the idea is to fill your mind with sacred thoughts, not to empty it.”

Okaasan (Mom) nailed it.

(Ceiling of the Duomo in Florence — Brunelleschi, 1436.
Click on an image to enlarge.)

 (Tenryuji in Kyoto. Originally built in 1339.)

Replication Complications

December 14, 2019
Posted by Jay Livingston

Some people can tell a joke. Others can’t. Same joke. One person has everyone laughing, the other gets zilch. Does the null response mean that the joke isn’t funny?

What we have here is a failure to replicate.

A couple of days ago, the Psychology Archive (PsyArXiv) published results showing a failure to replicate an experiment on Terror Management Theory (TMT).* Among the possible reasons for this failure, the authors say,

There was substantial nuance required in implementing a successful TMT study. . . . These nuances include how the experimenter delivers the experimental script (tone, manner). . .

I offered this same idea five years ago. I didn’t use the term “nuance.” Instead, I speculated that some experimenters knew how to “sell it” —  “it” in this case being the basic manipulation or deception in the experimental set-up. You can read the whole post (here), but here’s a somewhat shorter replication. I’m copy-and-pasting because as we get more results from replication studies, it’s still relevant. Also, I liked it.

*              *            *             *


One of the seminal experiments in cognitive dissonance is the one-dollar-twenty-dollar lie, more widely known as Festinger and Carlsmith, 1959. Carlsmith was J. Merrill Carlsmith. The name itself seems like something from central casting, and so did the man – a mild-mannered WASP who prepped at Andover, etc. Aronson is Eliot Aronson, one of the godfathers of social psychology, a Jewish kid from Revere, a decidedly non-preppy city just north of Boston.

In the experiment, the subject was given a boring task to do — taking spools out of a rack and then putting them back, again and again — while Carlsmith as experimenter stood there with a stopwatch. The next step was to convince the subject to help the experimenter. In his memoir, Not by Chance Alone, Aronson describes the scenario.

[Merrill] would explain that he was testing the hypothesis that people work faster if they are told in advance that the task is incredibly interesting than if they are told nothing and informed, “You were in the control condition. That is why you were told nothing.”

At this point Merrill would say that the guy who was supposed to give the ecstatic description to the next subject had just phoned in to say he couldn't make it. Merrill would beg the “control” subject to do him a favor and play the role, offering him a dollar (or twenty dollars) to do it. Once the subject agreed, Merrill was to give him the money and a sheet listing the main things to say praising the experiment and leave him alone for a few minutes to prepare.

But Carlsmith could not do a credible job. Subjects immediately became suspicious.

It was crystal clear why the subjects weren't buying it: He wasn't selling it. Leon [Festinger] said to me, “Train him.”


Sell it. If you’ve seen “American Hustle,” you might remember the scene where Rosenfeld (Christian Bale) is trying to show the FBI agent disguised as an Arab prince how to give a gift to the politician they are setting up.  (The relevant part starts at 0:12 and ends at about 0:38)



Here is the script:


Aronson had to do something similar, and he had the qualifications. As a teenager, he had worked at a Fascination booth on the boardwalk in Revere, Massachusetts, reeling off a spiel to draw strollers in to try their luck.

Walk right in, sit in, get a seat, get a ball. Play poker for a nickel. . . You get five rubber balls. You roll them nice and easy . . . Any three of a kind or better poker hand, and you are a winner. So walk in, sit in, play poker for a nickel. Five cents. Hey! There’s three jacks on table number 27. Payoff that lucky winner!

Twenty years later, he still had the knack, and he could impart it to others.

I gave Merrill a crash course in acting. “You don't simply say that the assistant hasn't shown up,” I said. “You fidget, you sweat, you pace up and down, you wring your hands, you convey to the subject that you are in real trouble here. And then, you act as if you just now got an idea. You look at the subject, and you brighten up. ‘You! You can do this for me. I can even pay you.’”

The deception worked, and the experiment worked. When asked to say how interesting the task was, the $1 subjects gave it higher ratings than did the $20 subjects. Less pay for lying, more attitude shift.

The experiment is now part of the cognitive dissonance canon. Surely, others have tried to replicate it. Maybe some replications have not gotten similar results. But that does not mean we should toss cognitive dissonance out of the boat. The same may be true for TMT. It’s just that some experimenters are good at instilling terror, and others are not.

----------------------------------
* If you’ve never heard of TMT (I hadn’t), it’s basically the idea that if you get people to think about their own mortality, their attitudes will become more defensive about themselves and their group. Of the twenty-one replications, a very few got results that supported TMT, and a very few got results that contradicted it. Most found no statistically significant or meaningful differences.

Here’s the set-up for the independent variable: The subjects in the Terror condition were asked to write about “the emotions they experienced when thinking about their own death, and about what would happen to their physical body as they were dying and once they were dead.” The non-Terror subjects were asked to write about the same things about watching television — e.g., what happens to your physical body when you watch TV. (I am not making this up.)