Moral Nostalgia and the Myth of the Authoritarian Past

February 27, 2007
Posted by Jay Livingston

Brad Wright, on his Christian sociologist blog, talks about “moral nostalgia.” Great coinage. It might even join those few other terms that have crossed over from sociology into the general vocabulary — terms like role model and self-fulfilling prophecy. (Can anyone think of some others?)

The idea that people, especially young people, are less moral than the previous generation is apparently irresistible. Blogger (and Montclair State sociology alumna) Trrish P has a link to a version of moral nostalgia called “Take Me Back to the Sixties.” Ah yes, the moral paradise of the sixties. Of course, this guy’s counterpart in the sixties was complaining that society then represented a sharp decline from some earlier golden era. And in the fifties, too, parents lamented the moral decline among youth, the sort of thing satirized in the song “Kids” from Bye Bye Birdie. The show opened on Broadway in 1960, so its sensibility was pure 1950s. Paul Lynde sang the song, and he did a great job of mocking the moral nostalgia while pretending to espouse it. Here’s a link to him doing a brief version at the 1971 Tony Awards.

Kids! I don't know what's wrong with these kids today!
Kids! Who can understand anything they say?
Kids! They are disobedient, disrespectful oafs!
Noisy, crazy, dirty, lazy, loafers!
And while we're on the subject:
Kids! You can talk and talk till your face is blue!
Kids! But they still just do what they want to do! . . . .

Kids! They are just impossible to control!
Kids! With their awful clothes and their rock an' roll!


(Music by Charles Strouse, lyrics by Lee Adams)

Sociologists are not immune from moral nostalgia. In class, I often use an essay from an intro text which explains the truly steep rise in juvenile homicide in the late 1980s by pinning the rap, in part, on the “decline in the moral authority of the family.”

When I read that phrase, in my mind’s ear I always hear Paul Lynde singing “Kids!” But students see the statement as an obvious truth, and most of them say that in the ten years since the essay was written, the moral authority of the family has continued its regrettable slide.

The essay presents absolutely no evidence that the decline has occurred (it’s a short essay, and the authors have many fish to fry), so I use it as an example of how difficult it is to operationalize a concept like “moral authority of the family” and get evidence about it, especially for comparing past eras with our own. But I also suspect that the decline in family authority, at least among middle-class families, is a myth.

Has there ever been a generation when parents said, “You know, kids today are a lot better behaved than we were”? I suspect that even the parents of the “Greatest Generation” didn’t think the kids were so great. At least in the US, the idea that morals are slipping and that kids are less respectful and obedient is as old as the Republic, and it may have something to do with our relatively non-authoritarian family and our emphasis on independence even for children.

But I think there’s a more general source for this myth of the authoritarian past. It’s common to hear parents say something like, “The things kids say and do today — I could never have gotten away with that with my old man.” (I usually imagine a man saying this, perhaps because authority is not so much an issue for women.) The man who says this pictures his own father as much more powerful than he, the speaker, is now. But that’s only because he is remembering his father from the perspective of a child. When he was a child, his father really was much more powerful than he was — so much bigger and stronger, it seemed the father could do whatever he wanted. But when that child grows up and thinks about himself today, he is not looking up from the viewpoint of his own small children. Instead, he sees himself from his own place in the larger world. He knows that he is certainly not the biggest or strongest person around, and he knows that his actions are limited by all sorts of constraints that are largely invisible to children. He sees that he cannot control all aspects of his children’s lives.

It’s a short and obvious step from this perception — my father was more powerful when I was a kid than I am today — to the general idea that kids these days are disobedient, disrespectful, and impossible to control. And no doubt, his children will grow up remembering their own childhood as relatively authoritarian, and on and on through the generations.

Wisdom and Crowds Go to Hollywood

February 23, 2007 
Posted by Jay Livingston

The Wisdom of Crowds crowd loves to cite the ability of “prediction markets” to pick the Oscar winners. But this year, you don’t need a prediction market to know which way the wind blows. All the major awards seem to be sure things, except perhaps Best Picture. Here, for example, are the prices on the Best Actress nominees. You get 100 points if your choice wins. Here’s what you pay:

Helen Mirren (The Queen) 94
Judi Dench (Notes on a Scandal) 2
Penelope Cruz (Volver) 2
Meryl Streep (The Devil Wears Prada) 4
Kate Winslet (Little Children) 1

In other words, people are willing to risk 94 points to win 6 on Ms. Mirren. If Judi Dench wins, her backers will get back 98 points of house money along with the two they paid.
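
If you want the arithmetic spelled out, here’s a minimal sketch in Python. The prices are just the Best Actress quotes listed above, and the scoring is as described (pay the price, collect 100 points on a win); nothing here comes from any exchange’s actual software.

```python
# Prediction-market scoring as described above: you pay the listed price,
# and you collect 100 points if your nominee wins.
prices = {
    "Helen Mirren": 94,
    "Judi Dench": 2,
    "Penelope Cruz": 2,
    "Meryl Streep": 4,
    "Kate Winslet": 1,
}

for nominee, price in prices.items():
    profit = 100 - price   # what a winning backer clears beyond the stake
    implied = price / 100  # the crowd's implied probability of a win
    print(f"{nominee}: risk {price} to win {profit} (implied probability {implied:.0%})")
```

Notice that the five prices sum to 103, so the implied probabilities add up to slightly more than 100%; prices in markets like these rarely sum to exactly 100.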
Other consensus choices and current prices:

Director - Martin Scorsese (The Departed) 88
Supporting Actor - Eddie Murphy (Dreamgirls) 61 
Supporting Actress - Jennifer Hudson (Dreamgirls) 76 
Actor - Forest Whitaker (The Last King of Scotland) 82 
Documentary - An Inconvenient Truth 85

On Best Picture, the crowd’s wisdom is less obvious. Since Picture and Director usually go to the same movie, The Departed has an edge, but not much of one considering the consensus on its director.

The Departed 44
Letters from Iwo Jima 7
The Queen 4
Babel 22
Little Miss Sunshine 22

Unfortunately, the major prediction marketplace for the Oscars, the Hollywood Stock Exchange, doesn’t have markets for the lesser categories, the ones that seem as arcane as baseball records. Best sound design in a foreign documentary by a left-handed shortstop on a Wednesday. But one English bookmaker does have some of these. For the record, and to see how wise the crowd turns out to be, here are some of them with the equivalent prices. The numbers show that the crowd is nowhere near as unanimous as it is on the major awards:

Animated Short Film - The Little Matchgirl 58 
Art Direction - The Prestige 40
Cinematography - Children Of Men 29
Costume Design - Marie Antoinette 38
Original Score - Babel 47
Visual Effects - Pirates Of The Caribbean DMC 80


I just hope Ellen DeGeneres is at the top of her game, because if the crowd is wise, there won’t be much suspense about the winners. It reminds me of March 1992. I was teaching a Monday night class, and the date I had scheduled for the midterm turned out to be Oscar night (the ceremony was still held on a Monday back then). As a final multiple-choice question, just for fun, I had put, “The winner for best picture in tonight’s Oscars will be . . .” and listed the five nominees.

I had intended the question to lighten things up. What a miscalculation. What happened was that several students, after turning in their tests, complained that the question was unfair. How could they possibly be expected to know what was going to happen in the future, and besides what did any of this have to do with the criminal justice system, and so on. I assured them that I had no intention of including it in their test score.

When I marked the exams the next day, it turned out that of the 35 multiple-choice questions, that was the only one that everybody in the class had gotten right.

The Wisdom of Crowds. The Silence of the Lambs.

Is Anna Nicole Smith Still Dead?

February 21, 2007

Posted by Jay Livingston

Britney Spears is on the front page of tabloids like the New York Post again today, though in some papers she’s competing with Anna Nicole Smith. Aren’t you tired of these stories? Do you think the Anna Nicole coverage is way out of proportion to what the story deserves?

You’re not alone. A poll by the Pew Research Center for the People and the Press finds that the People think the Press has overdone the Anna Nicole Smith story. In the survey, taken a week ago, 61% said the story had been given too much coverage. Interestingly, 8% thought Anna Nicole merited even more press. Still, more than one person in ten said it was the story they’d followed most closely (18% among younger women, ages 18 to 49).

Nevertheless, here we are a week later, and Anna Nicole is still on the front page, and she's one of the first stories on the 11 o’clock news. Presumably, the people in the news business know what they're doing. So if people had really had enough of the Smith story, wouldn’t they pass up the newsstand and turn off the TV? Maybe this is one of those cases where there’s a discrepancy between what we say and what we do.

I wish we had something besides survey data to find out what news stories people are interested in. Surely Google, MSN, and Yahoo keep track of which stories people are clicking on. Do they make the data available?

In the meantime, news programmers will continue to feature Anna Nicole (and Britney), news watchers will continue to tell pollsters “enough already,” and newscasters will continue to make comments, usually off-camera, like this one by CNN reporter Jack Cafferty.



Cafferty’s question, “Is Anna Nicole Smith Still Dead?” is an allusion to a news anecdote of a half-century ago. In 1952, actor John Garfield died of a heart attack, which might have been news enough since he was a handsome Hollywood star, he was only 39, and he’d been blacklisted after refusing to name names when called to testify before the House Un-American Activities Committee.

But Garfield’s death was especially newsworthy because he had suffered the heart attack while making love, and the woman he was making love to was not the woman he was married to. The press squeezed the story to the last drop of ink, playing out every possible angle.

The anecdote, at least the way I heard it, goes that on a slow news day weeks afterward, editors were sitting around a table, trying to decide on the day's headlines. Nothing in the news seemed to have the attention-grabbing juice needed for the front page.

So someone suggested, “John Garfield Still Dead.”

(A personal note: Garfield's son David went to the same college I did. We weren't buddies, but we knew one another by name. In Googling around for this post, I discovered that David, like his father, became an actor, and like his father he died of a heart attack. He was 52.)

Mixi Messages and Wa

February 18, 2007
Posted by Jay Livingston

When I spent a few months in Japan many, many years ago, the Japanese often told me that Americans were “frank.” At first, I took this as a compliment. Only later did it dawn on me that what they were really saying was that Americans tended to shoot off their mouths, saying whatever they thought, without much regard for how it would affect others. Americans seemed more interested in expressing their own individual opinions and quite willing, in the process, to trash the overall harmony within the group, what the Japanese call “wa.”

Sportswriter Robert Whiting’s 1989 book You Gotta Have Wa is about Wa in Japanese baseball. (The title is an allusion to the song “You Gotta Have Heart,” from the baseball-themed musical Damn Yankees.) Whiting describes the difficulties that arose when American baseball players who couldn’t quite stay in the majors wound up in Japan. Like good Americans, they would see their main task as playing well, getting hits, etc. But they would ignore or even resent a task that most Japanese would take for granted — becoming and being a member of the team, especially in the sense of subordinating their own preferences and accomplishments to the overall Wa of the group.

I myself unwittingly committed similar cultural gaffes, one of them so egregious that it’s a wonder I wasn’t immediately ostracized if not executed.

I remembered this distant past, with some embarrassment, when I read in Wired Online about MySpace moving into Japan, where the popular site is Mixi. Rupert Murdoch, whose News Corp owns MySpace (and a lot of other media), has never been shy about expanding his empire, and since November MySpace has run a Japanese site.

The differences between MySpace and Mixi reflect the broader cultural difference between individualism and Wa. The name says it all: MySpace “is about me, me, me, and look at me and look at me and look at me,” says an American media executive in Japan. “In Mixi, it’s not all about me. It’s all about us.” In fact, American parents are surprised and often dismayed by how much personal information teenagers will put up on their MySpace pages for any stranger to see, and by the way kids will use the Internet for nastiness and character assassination.

But Mixi messages tend to be more supportive, and the site is organized around groups rather than individuals. To join, you need an introduction from someone who is already a member, and communication remains centered in clusters of friends or people who share interests. It’s more a way to maintain relations within a group than a way of meeting new people or expressing yourself.

I checked the home pages of the two sites today. MySpace is very in-your-face. A photo from the cover of Nylon (screaming bold yellow typeface) shows a woman who seems to be shrieking and sticking her tongue right into the camera. Just above, “Meet Sal” shows a very confrontational Sal, and then there’s Jonathon in battle gear, and over on the right of the screen “cool new people” being cool by acting wild and crazy.



The Mixi login page shows a girl sitting in a field of grass, quietly reading a book, as her friend walks up to her. Both girls are dressed conventionally, and we see them from a distance. The words, not in garish yellow but almost blending into the blue summer sky, say, “community entertainment.”

It will be interesting to see what happens with MySpace in Japan. Will the medium itself shape the content? Will it speed the development of a more individualistic and less group-oriented culture among Japanese youth, a change which has been slowly evolving in any case? Or will the culture reshape the site and make it more typically Japanese?

Necessities

February 15, 2007
Posted by Jay Livingston

In debates about poverty in the US, conservatives will usually point out that many people with incomes below the poverty line have a standard of living equal to that of middle-class people of earlier times or other places. Here’s a typical version, by Robert Rector, written in 1990 and posted on the conservative Heritage Foundation website. After noting that 38% of the poor own homes and 62% own cars, Rector concludes,
“Poor” Americans today are better housed, better fed, and own more property than did the average U.S. citizen throughout much of the 20th Century. In 1988, the per capita expenditures of the lowest income fifth of the U.S. population exceeded the per capita expenditures of the median American household in 1955, after adjusting for inflation.
(The quotation marks are enough to clue you in that Rector doesn’t think that a family living on $15,000 a year is really poor. They’re merely “poor.”)

By a similar logic, the “poor” today are better off than J. P. Morgan because Morgan didn’t have a washing machine. And they’re better off than Louis XIV because the Sun King didn’t have indoor plumbing.

Bill O’Reilly put the “not really poor” argument more succinctly: “Even the poor have color television sets and pretty much everything they need.”

O’Reilly at least comes closer to the real issue. Being poor is not simply a matter of what you have. It’s what you have compared with what you need.

But what do you need? In the 1600s, nobody needed a flush toilet because nobody had one. And for a similar reason, nobody in the Gilded Age of the late 1800s needed a washing machine.

Needs are determined by what people have. If nearly everybody in a society has a car, that society becomes a place where a car is a necessity. And if you can’t afford to buy what people think are the necessities, you are poor.

Here are the results of a Pew Research poll published late last year. The poll asked people whether they thought an item was “a necessity” or “a luxury you could do without.”
(You don’t have to be a social constructionist to see that what people need is not much different from what they think they need.)


(Click on the graph to see it in a larger version)

Have you got what it takes?

Note that ten years ago, a microwave was a luxury for two-thirds of the population. Now it's a necessity for two-thirds. I’m in the minority on that one, and I'm missing two of the other top five items as well.

Operationalizing Investment

February 13, 2007

Posted by Jay Livingston

How do you turn a vague concept like “decline in the moral authority of the family” into a variable you can actually use in research? This week, I told my students that operationalizing concepts is the key to thinking like a sociologist. Of course, at the beginning of the semester when I introduced the idea of social facts, I told them that thinking in terms of social facts was the essence of thinking like a sociologist. (Moral authority happened to be the subject that came up; it's not the subject of this post.)

I also tell students that no matter how you operationalize it, someone is going to complain that you've left something out. And they're right. Most of these conversions from abstract to measurable are imperfect. But you go to research with the variables you have, not the variables somebody else would like you to have.

I've been thinking about this problem in connection with parental “investment” in children. Back in November, one of the candidates we interviewed for a position this year, Kristen Schultz Lee, taught a sample class based on her research, which was about “parental investment in children’s education” and whether it was greater for boys than girls. I thought that was an interesting idea. I don’t usually think of what my parents did for me as an “investment,” i.e., something with an eventual payoff for the investor.

Kristen operationalized “investment” (at least in her talk to my class) as merely how far the child went in school. It works, but I wondered if there might not be more to this idea of investment. After all, she did her interviewing in Japan, where, supposedly, the stereotype “kyoiku mama” (education mama) stays up late with her son, helping him with homework and going over drills in preparation for the all-important college entrance exams. (Click on that link and you’ll get several other negative female stereotypes in Japanese.)

Now an article in the latest issue of the American Sociological Review looks at something similar. It compares adoptive and biological parents’ investment in children’s education, but the researchers (Laura Hamilton, Simon Cheng, and Brian Powell) define investment more broadly: “the economic, cultural, social, and interactional resources that parents provide for their children.”
  • Economic: the things that money buys (books, computers, private school).
  • Cultural: not just music lessons, but even the time parents spend with kids, reading or just playing.
  • Interactional: what non-sociologists call “talking” with the child.
  • Social capital: talking with other school parents, going to PTA meetings, etc.
(I guess I’m not much of a sociologist after all. In all those hours I spent squeezing Play-doh with my kid, I never thought of it as providing him with Cultural Capital. And when I went out for coffee with other parents after drop-off, I thought I was just putting off going to work. Now I realize I was building up my kid’s Social Capital.)
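
Out of curiosity about what the conversion from concept to variable might look like in practice, here’s a hypothetical sketch in Python. The survey items, weights, and scoring are all invented for illustration; they are not the measures Hamilton, Cheng, and Powell actually used.

```python
# A hypothetical operationalization of parental "investment" as a composite
# score. The items and weights below are invented for illustration; they are
# not the ASR study's actual measures.

def investment_score(household: dict) -> float:
    """Combine four kinds of parental resources into one rough index."""
    economic = household["books_in_home"] / 100 + household["has_computer"]
    cultural = household["hours_reading_with_child_per_week"] / 5
    interactional = household["school_talks_per_week"] / 5
    social = household["pta_meetings_per_year"] / 10
    return economic + cultural + interactional + social

# One made-up household, just to show the mechanics:
example = {
    "books_in_home": 150,
    "has_computer": 1,
    "hours_reading_with_child_per_week": 3,
    "school_talks_per_week": 4,
    "pta_meetings_per_year": 6,
}
print(round(investment_score(example), 2))  # prints 4.5
```

The particular weights don’t matter; the point is that once a concept is pinned down this way, someone can complain about what you’ve left out, which, as I tell my students, is exactly what happens.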

The importance of this research lies in its implications for biology-based theories about human behavior. These theories, under names like “sociobiology” or “evolutionary psychology,” have become increasingly influential in social science. In this case, they would probably predict that adoptive parents would be less invested in their children.

But this study found nothing of the sort: “two-adoptive-parent families invest at similar levels as two-biological-parent families but still at significantly higher levels in most resources than other types of families.”

Interestingly enough, Kristen Schultz Lee had included similar variables in her research (homework helping, cram schools), though she didn’t mention them in my class. In keeping with the kyoiku-mama stereotype, extracurricular activities were somewhat gender-related (girls do cultural classes, boys do academic classes). But I don’t think the overall gender differences in educational investment were as large as I might have expected.

(Hi, Kristen. As I write this, Oswego County has had twelve feet of snow in the last ten days or so. In Montclair, there’s not a snowflake to be seen.)

Minding the Gap

February 11, 2007

Posted by Jay Livingston

Back in November, I blogged about Google Trends. Now Google has another cool tool, still in beta. It’s called Gapminder, and as the name implies, it shows the gaps among countries of the world. You can choose from about a dozen variables, mostly economic and health data, and get an XY graph with the size of each dot corresponding to the population of the country. You can also select which countries to identify with a name label. Here’s a chart plotting doctors per capita against per capita income. (The actual screen will look clearer than this reproduction.) Belarus, like many of the other former Soviet republics, has much lower income than the US but slightly more doctors per capita.



The flash presentation also tracks changes since 1975. The chart below shows trends in infant mortality and income for the US and the Czech Republic (whose data begin only in the early 1990s, around the time of its founding).
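
If you want to mimic that display yourself, here’s a minimal sketch of a Gapminder-style bubble chart using Python’s matplotlib. Every number in it is an invented placeholder; it shows the chart form, not Gapminder’s data.

```python
import matplotlib.pyplot as plt

# Gapminder-style bubble chart: X is per capita income, Y is doctors per
# 1,000 people, and the bubble area tracks population. All figures below
# are invented placeholders, not real country data.
countries = ["Country A", "Country B", "Country C"]
income = [40_000, 8_000, 2_000]   # per capita income (hypothetical)
doctors = [2.5, 3.5, 0.5]         # doctors per 1,000 people (hypothetical)
population = [300, 10, 140]       # millions (hypothetical)

plt.scatter(income, doctors, s=[p * 3 for p in population], alpha=0.5)
for name, x, y in zip(countries, income, doctors):
    plt.annotate(name, (x, y))    # the optional name labels
plt.xlabel("Per capita income")
plt.ylabel("Doctors per 1,000 people")
plt.title("Gapminder-style bubble chart (hypothetical data)")
plt.show()
```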



A similar site, gapminder.org, has slideshow presentations of some of the same variables. It groups countries of similar economic levels to show the differences among them.



Losing Our Religion?

February 9, 2007

Posted by Jay Livingston

I have been assuming that the Bush years have been good for religion. His “faith-based initiatives” have sent billions of government dollars to churches and other religious organizations. And when religion-based policies have conflicted with scientific findings, guess which carried the day, at least in the federal government.

Thomas Jefferson famously wrote of the “wall of separation between church and state.” George W. Bush seems to have heard a voice telling him to tear down that wall. More than any other president in modern times, or perhaps since the founding of the republic, Bush has tried to make religion a part of government and politics.

Bush’s policy success in tearing down Mr. Jefferson’s wall does not seem to have won over more of the public. Here are the results of two Gallup polls, one taken just as Bush was coming into office, the other just last month.

The question was: "Next, I'm going to read some aspects of life in America today. For each one, please say whether you are very satisfied, somewhat satisfied, somewhat dissatisfied, or very dissatisfied. How about the influence of organized religion?"


Americans are still satisfied with the role of religion (56% vs. 39%), but dissatisfaction has grown during the Bush years. Do people want to see the Bush trend continue?

The proportion of Americans saying they want religion to have less influence has increased by 45% in relative terms: from 22% in 2001 to 32% in 2007, a rise of ten percentage points on a 22-point base.

It’s hard to know what to make of the change. Thirty-two percent wanting less religious influence (maybe only a little less) is still a clear minority, and America is still far more religious than other advanced industrialized countries. The executive, legislative, and judicial branches have all greatly favored religion.

The puzzling irony is that despite its dominance, the Christian majority feels threatened. Nearly sixty percent of Americans agreed that "Christianity is under attack in the US today." OK, this does come from a Fox News poll, and maybe people have just been listening to Bill O'Reilly. But it's possible they see these Bush-era trends as omens for the future.

Super Bowl Ad Work

February 7, 2007
Posted by Jay Livingston

TV commercials are compressed versions of some aspect of our culture. After all, if you’re going to spend $2.5 million just to get your ad on the air for thirty seconds, you want to be very sure that it resonates with widely held ideas. The straight commercials embrace the dominant values and give them a big kiss — Coca-Cola’s everyone-happy-together, Chevy’s America-is-best. The funny ads take a more critical view of the culture.

Several of the Super Bowl ads were about work. On the straight side was the GM robot ad. A robot drops a screw, loses his job at the GM plant, and descends first to holding up signs, then to working in a fast-food joint, and finally to committing suicide by jumping off a bridge, all while the soundtrack plays the mawkish “All By Myself.” The only spoken words in the ad come at the end: “The GM 100,000 mile warranty — it’s got everyone at GM obsessed with quality.”

A full minute showing how capitalist competition benefits consumers and makes workers virtuous. It’s one of the core ideas of conservatism. For example, here’s David Frum (he worked in the Bush White House, even wrote a book about W. called The Right Man, and writes for National Review):
The great, overwhelming fact of a capitalist economy is risk. Everyone is at constant risk of the loss of his job, or of the destruction of his business by a competitor . . . Risk makes people circumspect. It disciplines them and teaches them self-control . . . Social security, student loans, and other government programs make it far less catastrophic than it used to be for middle-class people to dissolve their families. Without welfare and food stamps, poor people would cling harder to working-class respectability than they do now.

The CareerBuilder ads offered a less laudatory picture of competition in the workplace — the one about performance assessment and this one about promotion.




The most curious ad in this category was the one for SalesGenie.com. At first I thought it was going to be another spoof of the success-worshipping worker. I thought that the incredibly successful salesman — red Ferrari, boss’s invitation home for dinner, etc. — was going to be held up to ridicule as the obnoxious guy that he seems to be. But no, he’s the one we’re supposed to identify with. He’s supposed to make us want to use the same product he does.


I wonder if SalesGenie wasted several million dollars on this one.

Twilight Time?

February 4, 2007
Posted by Jay Livingston

The death of Seymour Martin Lipset a month ago provided the news peg for the Wall Street Journal to run a piece proclaiming “The Twilight of Sociology.” Lipset was the WSJ’s kind of guy — a 1930s Trotskyite socialist who became a neoconservative.

The author of the article, Wilfred McClay, a professor of humanities, sees the 1950s and 60s as the “golden age” of US sociology, but the titans of that era are dying off (Lipset, Rieff, and Riesman in the past year or so). And according to McClay, no new giants are rising up to take their place. Where are the grand sociologists?

McClay, the good conservative (what else would you expect to find in the WSJ?), first blames liberal politics. “Academic journals and scholarly monographs were given over to supporting the reigning views of race, gender and class — and fiercely suppressing any inquiry that might challenge these views.” Then he blames the concept of social construction: “many sociologists came to believe” that all reality was “socially constructed.”

McClay exempts the sociology of religion from his condemnation: “a lively subfield, populated by outstanding figures such as Robert Bellah, Robert Wuthnow and Peter Berger.” The irony, of course, is that Peter Berger is co-author of the seminal book on social construction.

McClay also blames “scientism,” making much the same criticism that C. Wright Mills leveled at the “abstracted empiricism” style of sociology fifty years ago. McClay never mentions Mills among the giants of that golden age, probably because Mills was guilty of what McClay sees as current sociology’s main sin — “misguided activist zeal.”

McClay urges sociology to recover its potential for greatness by going back to one of the abiding themes of “old sociology”: how the stubbornness of social forces circumscribes what is possible for us as individuals.

I’m not sure that McClay is right about anything. Are there no sociologists of the stature of Lipset or Riesman today? It’s often hard to tell who the giants are until you look back from the perspective of many years. When you’re standing right next to them, they may not seem so impressive (though Lipset was indeed an imposing physical presence; so was Alvin Gouldner, a Lipset contemporary probably too liberal for McClay to mention).

And if it turns out that there are no towering figures, is the cause to be found in our ideas, our ideologies, and our activism? McClay is so eager to pin the blame on progressive ideas that he ignores his own advice. He says nothing about the social forces that constrain sociology today. The social and economic realities of universities, journals, granting agencies, and publishers probably have a greater impact on the form and content of our work than does our ideology.

It may also be that we are in a “normal science” phase, still working out the implications of ideas laid down in the social scientific revolution of a century or more ago. Sociologists in this phase may still do great work — even those who think that Lipset was a great sociologist could hardly argue that he shook the theoretical or methodological foundations of the field — but they are unlikely to be seen immediately as giants.

(I offer no link to McClay’s article because the Wall Street Journal does not provide free access to its articles. But if you have the ABI/INFORM database, you can find it: February 2, 2007, page W13.)