Monday, March 14, 2011

A Simple Model of Cults of Personality

(Apropos of nothing in particular, though this article on Gaddafi’s cult of personality and this article on the indoctrination of children at a school in Libya probably had something to do with it. I’m also lecturing tomorrow on the mechanisms of control used by dictators, and this is something I might want to tell my students; writing helps with self-clarification.)

Cults of personality are hardly ever taken seriously enough. They are often seen as a sort of bizarre curiosity found in some authoritarian regimes, their absurdities attributed to the extreme narcissism and megalomania of particular dictators, who wish to be flattered with ever greater titles and deified in ever more grandiose ways. And it is hard not to laugh at some of the claims being made on behalf of often quite uncharismatic dictators: not only is Kim Jong-il, for example, the greatest golfer in the world, but he also appears to have true superhero powers:

In 2006 Nodong Sinmun published an article titled ‘‘Military-First Teleporting’’ claiming that Kim Jong-il, ‘‘the extraordinary master commander who has been chosen by the heavens,’’ appears in one place and then suddenly appears in another ‘‘like a flash of lightning,’’ so quickly that the American satellites overhead cannot track his movements. (Ralph Hassig and Kongdan Oh, The Hidden People of North Korea, p. 55).

To the extent that cults of personality are taken seriously, moreover, they are often analyzed in terms of their effects on the beliefs of the people who are exposed to them. Thus, the typical (if at times implicit) model of how a cult of personality “works” is one in which people are indoctrinated by exposure to the cult propaganda and come to believe in the special qualities of the leader, no matter how implausible the claims, simply because alternative sources of information about the leader do not exist. On this model, the cult of personality creates loyalty by producing false beliefs in the people, and the best way of combating its effects is to provide alternative sources of information. Even scholars who are well aware of the basic unbelievability of cults of personality often speak as if their function were to persuade people, even if they fail to achieve this objective. Hassig and Oh, for example, write that “[e]ven in North Korea few people have been convinced by this propaganda because since Kim came to power, economic conditions have gone from bad to worse” (p. 57), which makes it seem as if the main purpose of the cult of personality were to convince people of the amazing powers of Kim Jong-il.

But this way of thinking about cults of personality misses the point, I think. Not because it is entirely wrong; it is certainly plausible that some people do come to believe in the special charisma of the leader because they have been exposed to the propaganda of the cult since they were children, though the evidence for this is scarce. In Lenin’s Tomb, David Remnick’s compulsively readable account of the last days of the Soviet Empire, one occasionally comes across descriptions of such people, usually elderly men and women who reject or rationalize any and all evidence of Stalin’s “errors” and hang on to their belief in Stalin’s godlike powers. Remnick also tells many stories of people who claim that they used to believe in Stalin but lost their faith gradually, like groupies who eventually outgrow their youthful infatuation with a band. And there is evidence that significant numbers of Russians (how many exactly it’s hard to say) remain “proud” in some sense of Stalin, though this “pride” in Stalin appears to have much less to do with Stalin’s actual cult of personality than with Stalin’s supposed achievements as a leader (e.g., winning WWII, industrializing the country, making Russia into a “high status” country that needed to be taken seriously on the world stage, etc.). Identification with a leader can be a form of “status socialism,” a way of retaining some self-respect in a regime that would otherwise provide little except humiliation. Yet, though I do not want to deny that cults of personality can sometimes “persuade” people of the superhuman character of leaders (for some values of “persuade”) or that they draw on people’s gullibility in the absence of alternative sources of information and their need for identification with high status individuals, they are best understood in terms of how dictators can harness the dynamics of “signalling” for the purposes of social control.

One of the main problems dictators face is that repression creates liars (“preference falsification,” in the jargon), yet repression is necessary for them to remain in power. This is sometimes called the dictator’s dilemma: because repression gives everyone an incentive to lie, it is hard for dictators to gauge their true level of support or to know whether the officials below them are telling the truth about what is going on in the country; yet they need repression if they are to avoid being overthrown by people who would exploit their tolerance to organize. Moreover, repression is costly, and it works best when it is threatened rather than actually used. All things considered, then, a dictator would often prefer to use repression efficiently, minimizing both its cost and its distorting effects on his knowledge. He can either allow relatively free debate and run some risk of being overthrown (this happens especially in poor dictatorships, which cannot construct a reliable monitoring apparatus, as Egorov, Guriev, and Sonin show [ungated]), or he can use repression and risk being surprised later by a lack of support.

Here is where cults of personality come in handy. The dictator wants a credible signal of your support; merely staying silent and not saying anything negative won’t cut it. To be credible, the signal has to be costly: you have to be willing to say that the dictator is not merely OK but a superhuman being, and you have to be willing to take concrete actions showing your undying love for the leader. (You may have had this experience: you are served some food, and you must provide a credible signal that you like it so that the host will not be offended; merely saying so is not enough, so you go for seconds and layer on the praise.) Here the concrete action required of you is typically a willingness to denounce others when they fail to say the same thing, but it may also involve bizarre pilgrimages, ostentatious displays of the dictator’s image, and the like. The cult of personality thus has three benefits from the point of view of the dictator (aside from stroking his vanity):

1. When everybody lies about how wonderful the dictator is, there is no common knowledge: you do not know how much of this “support” is genuine and how much is not, which makes it hard to organize against the dictator and exposes you to risks, sometimes enormous risks, if you so much as try to share your true views, since others can signal their commitment to the dictator by denouncing you. This is true of all mechanisms that induce preference falsification, however: they prevent coordination.
2. What makes cults of personality interesting, however, is that the more baroque and over the top the cult, the better (though the “over the top” level needs to be reached by small steps), since differences in signals of commitment indicate gradations of personal support for the dictator, and hence give the dictator a reasonable measurement of his true level of support that is not easily available to the public. (Though he has to be willing to interpret these signals, and not come to believe them naively.)
3. Finally, a cult of personality can in fact transform some fraction of the population into genuine supporters, which may come in handy later. In a social world where everyone appears to be convinced of the godlike status of the leader, it is very hard to “live in truth,” as Havel and other dissidents in communist regimes argued.
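Point 2 can be made concrete with a toy model. The sketch below is my own illustration, not anything from the argument above: the quadratic cost function, the parameter values, and the names (`chosen_signal`, `benefit`) are all assumptions. Citizens choose a praise level; praising is cheaper the more one genuinely supports the leader; and the distribution of observed praise then tracks the (unobservable) distribution of genuine support.

```python
import random

random.seed(0)

def chosen_signal(support, benefit=1.0):
    """Citizen picks the praise level x in [0, 1] that maximizes
    benefit * x - (1 - support) * x**2: praise pays a fixed per-unit
    benefit but is costlier the less one actually supports the leader."""
    if support >= 1.0:
        return 1.0
    return min(1.0, benefit / (2 * (1 - support)))

# Private support levels, drawn uniformly; invisible to the dictator.
citizens = [random.random() for _ in range(10_000)]
signals = [chosen_signal(s) for s in citizens]

# The dictator sees only `signals`, but because the support-to-signal
# mapping is monotone, the share of maximal ("over the top") praise
# tracks the share of strong supporters.
zealots = sum(x >= 1.0 for x in signals) / len(signals)
strong = sum(s >= 0.5 for s in citizens) / len(citizens)
print(f"share sending maximal praise: {zealots:.3f}")
print(f"share with support >= 0.5:    {strong:.3f}")
```

With these assumed parameters the optimal signal hits its cap exactly when support reaches 0.5, so the two printed shares essentially coincide: graded signals, read correctly, work like a free opinion poll.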

To be sure, in order for a cult of personality to work, you must start small, and you must be willing both to reward (those who denounce) and to punish (those who do not praise) with sufficient predictability, which presents a problem if control is initially lacking; there must be a group committed to enforcement at the beginning, and capable of slowly increasing the threshold “signal” of support required of citizens. (Some dictators fail at this: consider, e.g., Mobutu, who was partly unable to monitor what was being said about him or to punish deviations with any certainty.) But once the cult of personality is in full swing, it practically runs itself, turning every person into a sycophant and destroying everyone’s dignity in the process. It creates an equilibrium of lies that can be hard to disrupt unless people get a credible signal that others hate the dictator as much as they do and are willing to do something about it.
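The “equilibrium of lies” can be sketched with a simple threshold model in the spirit of Granovetter and Kuran; the population, thresholds, and function name below are my own illustrative assumptions, not anything in the text. Suppose every citizen privately opposes the dictator but will dissent openly only once the visible share of dissenters passes a private threshold:

```python
def cascade(thresholds, seed_share):
    """Iterate public dissent until it stops growing. `seed_share` is the
    share of visibly committed dissenters at the start (a credible signal
    that others hate the dictator too, e.g. an open protest)."""
    share = seed_share
    while True:
        joined = sum(t <= share for t in thresholds) / len(thresholds)
        new_share = max(joined, seed_share)
        if new_share == share:
            return share
        share = new_share

# Assumed population of 100: ten relatively brave citizens who join once
# 5% are visibly dissenting, and ninety who need 10%, 11%, ..., 99%.
thresholds = [0.05] * 10 + [i / 100 for i in range(10, 100)]

print(cascade(thresholds, 0.0))   # prints 0.0: no credible signal, nobody moves
print(cascade(thresholds, 0.05))  # prints 1.0: a visible 5% tips everyone
```

Everyone in this toy population privately hates the regime, yet without a seed the lie sustains itself; the same population flips completely once a credible 5% becomes visible, which is one way to read sudden collapses of the Ceausescu type.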

There is a terrific story in Barbara Demick’s Nothing to Envy: Ordinary Lives in North Korea (pp. 97–101) which illustrates both how such control mechanisms can work regardless of belief and the degradation they inflict on people. The story is about a relatively privileged student, “Jun-sang,” at the time of the death of Kim Il-sung (North Korea’s “eternal president”). The death is announced, and Jun-sang finds that he cannot cry; he feels nothing for Kim Il-sung. Yet, surrounded by his sobbing classmates, he suddenly realizes that “his entire future depended on his ability to cry: not just his career and his membership in the Workers’ Party, his very survival was at stake. It was a matter of life and death” (p. 98). So he forces himself to cry. And it gets worse: “What had started as a spontaneous outpouring of grief became a patriotic obligation … The inmiban [a neighbourhood committee] kept track of how often people went to the statue to show their respect. Everybody was being watched. They not only scrutinized actions, but facial expressions and tone of voice, gauging them for sincerity” (p. 101). The point of the story is not that nobody experienced genuine grief at the death of Kim Il-sung (we cannot tell whether Jun-sang’s feelings were common or unusual) but that the expression of genuine grief was beside the point: all had to give credible signals of grief or be considered suspect, and differences in these signals could be used to gauge the level of support. (This was especially important at a time of leadership transition; Kim Il-sung had just died, and others could have tried to take advantage of the opportunity had they perceived any signals of wavering support from the population; note the mobilization of the inmiban to monitor these signals.)
Moreover, the cult of personality induces a large degree of self-monitoring; there is no need to expend many resources if others can be counted on to note insufficiently credible signals of support and bring them to the attention of the authorities. The only bright spot in all this is that dictators can become unmoored from reality (they come to believe their own propaganda), in which case they can be surprised by eruptions of protest (e.g., Ceausescu).

18 comments:

  1. Might want to consider this as a precursor argument: http://journals.cambridge.org/action/displayAbstract?fromPage=online&aid=192911

  2. Thanks David for the article link - that looks very interesting.

  3. Anonymous, 8:31 AM

    Like when Mao died? The story in "Wild Swans" - everyone felt they had to cry or be suspect...

  4. Thought experiment: suppose that instead of a single dictator, there is an elite class, which has general control over most cultural organs and elite institutions but whose political ascendancy is somewhat tenuous. Like the dictator, it will want signals of loyalty (both for determining loyalties and because of general ingroup/outgroup dynamics.) But because it’s not a single dictator, the signal can’t just be “the Generalissimo is awesome.” Instead, the shibboleth would be particular statements, or articles of faith, that are known to be supported by the ruling class.

    Of course, making the shibboleth a mathematical axiom would be pointless; if nothing else, non-loyal mathematicians would be misperceived as loyal elites. The signal must be costly. So the articles of faith must be faintly ridiculous; easily challengeable on the facts, so that only people really devoted would at first make those claims. (Of course they can’t be obviously false either, else the integrity of the ruling class would also be called into question.)

    Like cults of personality, there is a ratchet effect. As the elite becomes more entrenched, the shibboleths, while never changing in form (too costly to coordinate), will ever increase in intensity and audaciousness. And the enforcement of the party line will be strongest in educational institutions where the future members of the elite class are trained. It would be considered a matter of proper education – like the finishing schools of old – when “politically incorrect” thoughts in these institutions are mercilessly squelched.

  5. AC, that's a very clear way of describing the process of building both a cult of personality and the ideological rituals of submission in "party" dictatorships. Something like the process you describe is visible in many communist dictatorships, with Marxism-Leninism providing the proper shibboleths. (But one can certainly imagine many other "languages" providing these shibboleths).

  6. Chris B., 11:19 PM

    In reading your excellent post I was reminded of the opening scene of King Lear, which demonstrates the type of praise that is earnest and inoffensive, but still not enough to please a given dictator (in Lear's case, the problem is not even that he is trying to establish a cult of personality--but I felt that the parallels to your discussion of the psychology of flattery still deserve mentioning). In response to Lear's demand that each of his daughters say who loves him most, only Cordelia provides a rational response: "I love your majesty / According to my bond; nor more nor less." The two other daughters, in contrast, say something to the effect of "I love you more than words can say"--and then subsequently offer many, many more expressions of love.

    In Cordelia's praise there is not even a hint of subversion. The problem is that in it there is also no inherent sacrifice, but in Regan's and Goneril's there is: a sacrifice of dignity when they so flagrantly defy logic. A response like Cordelia's normally ought not to upset a leader, but when it can be compared (unfavorably) to the level of sacrifice, and hence devotion, in another subject's praise, the Cordelias of the world are in trouble. This seems to me to be how a cult of personality might "start small" but at the same time initiate a snowballing system of one-upsmanship when it comes to public displays of support.

    http://shakespeare.mit.edu/lear/lear.1.1.html

  7. That's a nice reference, Chris B.

  8. When people are exposed to independent media, even when those independent sources do not report on the dictator’s faults, daily coverage of life in advanced democracies gives people a basis for comparison, and the cult portrait painted by the state-controlled media becomes offensive once people realize their lives differ from what those state-controlled news services say. I live this every day here in Angola. Officially, the ruling party won the 2008 parliamentary polls (with 82%) and has now run the country for over 35 years under the leadership of a head of state who has been president since 1979, and we all know that long-serving executive leaders are good neither for the mental health nor for the finances of a country...

  9. Anonymous, 3:20 AM

    wonderful post!

  10. Anonymous, 3:26 AM

    "in order for a cult of personality to work, you must start small, and you must be willing to both reward (those who denounce) and punish (those who do not praise) with sufficient predictability, which presents a problem if control is initially lacking; there must be a group committed to enforcement at the beginning, and capable of slowly increasing the threshold “signal” of support required of citizens."

    Very fascinating. I would like to hear more about how a dictator secures legitimacy in his initial group.

  11. Excellent post! Thank you for sharing it with us.

    One mechanism for securing legitimacy in the initial group is to kick over the rotten-seeming older system, whose supporters may be few.

    It helps if it is done militarily, because the system of ranks and command provides an automatic leg-up for the Great General.

  12. Anonymous, 10:37 PM

    It would be interesting to see how this applies to corporations, particularly as their influence on governments increases.

    Expressions of loyalty that show sacrifice (for example, firing people on command) are certainly required in the upper levels of (at least some) large semi-regulated American companies.

    This screening process tends to keep corporate secrets secret, increases ease of coordination, and reduces whistleblowing behaviour.

    I have not seen the same screening process at the top levels of younger, smaller companies, and do not know enough to comment on how the upper levels work in, say, an Apple, Microsoft, or Intel.

  13. Anonymous, I explore something like that in the China case in the post above ("Careerists and Ideologues").

  14. Great post. May I suggest there is a further element, one general to propaganda in totalitarian societies: crowding out. Part of the purpose is literally to leave no public space for critical views. So not only are you signalling support through the rituals of the cult of personality; the cult also excludes alternatives.

    Continuing AC's analysis of political correctness, one notes the rhetorical viciousness with which people who are critical of said markers of virtue are often treated. The point is to preserve said markers as signs of virtue (which they clearly are not if merely contestable opinion: hence having "wrong" opinions has to be a character defect), as well as to punish, and seek to crowd out, dissent.

  15. Lorenzo, crowding out is certainly part of it. (This is partly Lisa Wedeen's argument, in the paper noted by David in the first comment.)

  16. Anonymous, 3:18 PM

    Victor Shih has a related argument regarding "odious displays of loyalty" in China, published in the JOP.

  17. Thanks Anonymous - that Victor Shih article looks very interesting.

  18. Anonymous, 5:34 AM

    So interesting. Having lived in Syria and Iraq, I have been fascinated by the cults of personality there and, as an archaeologist, most by their physical manifestations: of course the huge statues and monuments, but also the Saddam and Assad posters (found in every living room), Saddam watches, stencils of Assad's face on most concrete surfaces, velvet paintings of the dead Basel Assad in the hotel lobby, the shrine to Basel on the filing cabinet in the bank, decals of Hafez, Basel, and the Ba'ath Party dove on the rear windows of taxi cabs (the Syrian trinity!), Basel and Hafez Assad pictures on the awnings shading the vegetable market stands, the shoeshine boy with the Basel Assad tattoo,...
