Tuesday, January 20, 2009


When Passivity Fails


Sometimes to get an inside view you have to fake insiderism.

That was the situation faced by the authors of When Prophecy Fails, a book documenting the activities of a flying saucer doomsday cult in the 1950s. In the Methodological Appendix the authors discuss how the secrecy surrounding the cult – it was wary of nonbelievers – required them to work undercover instead of openly.

As the authors explained: “We tried to be nondirective, sympathetic listeners, passive participants who were inquisitive and eager to learn whatever the others might want to tell us.”

But the authors and others who served as observers were still influential.

For example, to gain entrée an observer had to pretend to have experienced a supernatural event, since the group was receptive to those who had such experiences. The problem was that the observer’s story then served to reinforce the cult’s beliefs.

Two of the key figures in the cult were Marion Keech and Dr. Armstrong (not their actual names). Dr. Armstrong was a college professor who held meetings about Mrs. Keech and her prophecies.

A weird dream story was concocted for one observer in the study so she could enter Dr. Armstrong’s circle. The “dream” was tailored to fit the prediction of a catastrophic flood soon to occur. Since the story “proved” that there would be a devastating flood, the rest of the prophecy – flying saucers would arrive to save the chosen – had to be true. The observer was asked to repeat the story to other members and also to tape-record it so that her narrative could be sent out to believers in other locations.

That observer and three others joined the cult within ten days. As the authors note, just having four new members join a small group in a relatively short time had an impact by itself, reinforcing its belief system.

Despite their attempts at “passive participation,” the observers still influenced the group. One observer showed up on Christmas Day, and Mrs. Keech interpreted his visit as a sign that he was a spaceman.

On another occasion one of the authors was commanded by Mrs. Keech to lead a meeting. The author, concerned with maintaining neutrality, said that everyone should just meditate.

That seemed a safe bet until one member, Bertha, fell into a trance and started channeling messages. By doing next to nothing, the author had allowed her to take the floor. Bertha went on to become a factor in the group’s power structure, offering a competing viewpoint.

At least the observers tried not to influence events. But what if they hadn’t been so concerned? Tricksters or agents provocateurs would’ve had a field day.

I know if I had been one of the observers in the study, I would’ve struggled with controlling the jokester in me.


4 comments:

Doug said...

So, uh, I guess... don't quit your day job, Ray. Or at least not for trying to infiltrate groups undercover.

A man's gotta know his limitations, or something.

Ray said...

Doug:

This blog is my "day job."

I wonder if there's ever been a case where the group leader tried to gather true believers to his cause and ended up with only undercover sociologists, writers, reporters, cops, etc. trying to get the inside story, people who really didn't buy his vision. A cult leader surrounded only by infiltrators.

Ray

X. Dell said...

Ray, I can only imagine what the "jokester" in you might have done in this scenario. It is tempting, isn't it.

The problem with joining any cult--even in pretense--for research is that you become part of the cult, and therefore commit experimenter bias. I would suspect that the researchers understood this when they went in, hence the discussion of methodology. Still, this puts us behind the eight-ball, for their presence could have had far more impact than they themselves might have known about. Moreover, there's the chance that their observations could have been informed by the group, so to speak (actually, I would bet that they had come under the group's influence, at least to a minimal degree).

That doesn't necessarily negate all of their findings, especially some that might not have been known otherwise. Problem is, how much can we trust each individual finding, knowing that there is bias somewhere but not knowing where it could lie or show itself?

Ray said...

X. Dell:

I'm now reading the book, "Imaginary Friends," a fictional take by Alison Lurie on the "When Prophecy Fails" event. It does show an awareness of the points you've raised.

I wonder how much "moral ground" a sociologist would have to object to a jokester throwing some nonsense into the mix with a similar doomsday/UFO cult. As mentioned in WPF, even by doing nothing an observer was influencing events. It would just be a matter of degree for a trickster to knowingly add a bit of craziness, especially when the group was based on nonsense anyway.

After all, the sociological observer would still have a valid study, albeit a different one: how a jokester can gain acceptance in a group and cynically manipulate its gullible members.

Ray