My comments in [brackets] are called out in dark red to distinguish them more easily from the original text by Richard J. Ofshe, Ph.D., in Borgatta & Montgomery (eds.), Encyclopedia of Sociology, Volume 1, New York: Macmillan, 2000. I am not clear whether the original article is in the public domain; it was published in full at the link shown above on the CultEducation.com website. Before we move into Ofshe's text, it seems useful to cite Michael Langone's "checklist of cult characteristics" as reprinted in Tobias & Lalich (1994):
1) The group is focused on a living leader to whom members seem to display excessively zealous, unquestioning commitment.
2) The group is preoccupied with bringing in new members.
3) The group is preoccupied with making money.
4) Questioning, doubt, and dissent are discouraged or even punished.
5) Mind-numbing techniques (such as meditation, chanting, speaking in tongues, denunciation sessions, debilitating work routines) are used to suppress doubts about the group and its leaders.
6) The leadership dictates -- sometimes in great detail -- how members should think, act, and feel (e.g.: members must get permission from leaders to date, change jobs, get married; leaders may prescribe what types of clothes to wear, where to live, how to discipline children, and so forth).
7) The group is elitist, claiming a special exalted status for itself, its leader(s) and members (e.g.: the leader is considered the messiah or an avatar; the group and/or the leader has a special mission to save humanity).
8) The group has a polarized us-versus-them mentality, which causes conflict with the wider society.
9) The group's leader is not accountable to any authorities (as are, for example, military commanders and ministers, priests, monks, and rabbis of mainstream denominations). The group teaches or implies that its supposedly exalted ends justify means that members would have considered unethical before joining the group (e.g.: collecting money for bogus charities).
10) The leadership induces feelings of guilt [embarrassment, humiliation, and shame, as well as self-doubt and anxiety] in members in order to control them.
11) Members' subservience to the group causes them to cut ties with family and friends, and to give up personal goals and activities that were of interest before joining the group.
12) Members are expected to devote inordinate amounts of time to the group.
13) Members are encouraged or required to live and/or socialize only with other group members.
Coercive Persuasion and Attitude Change
Coercive persuasion and thought reform are alternate names for programs of social influence capable of producing substantial behavior and attitude change through the use of coercive tactics, persuasion, and/or interpersonal and group-based influence manipulations [group dynamics via peers and authority figures] (Schein 1961; Lifton 1961). Such programs have also been labeled "brainwashing" (Hunter 1951), a term more often used in the media than in scientific literature. However identified, these programs are distinguishable from other elaborate attempts to influence behavior and attitudes, to socialize, and to accomplish social control. Their distinguishing features are their totalistic qualities (Lifton 1961), the types of influence procedures they employ, and the organization of these procedures into three distinctive subphases of the overall process (Schein 1961; Ofshe and Singer 1986). The key factors that distinguish coercive persuasion from other training and socialization schemes are:
1. The reliance on intense interpersonal and psychological attack to destabilize an individual's sense of self and promote compliance
2. The use of an organized peer group
3. Applying interpersonal pressure to promote conformity
4. The manipulation of the totality of the person's social environment to stabilize behavior once modified
[Thus, four techniques used in the more aggressive forms
of substance abuse treatment programs (SATPs)... leaving plenty of room for
abuse of such by cynical instrumentalists from various mind-control cults. The
same techniques are used in authoritarian families (AFs), military training,
multi-level marketing schemes (MLMs), as well as in morally perfectionistic
& evangelical charismatic religious cults (ECRCs), large-group awareness
trainings (LGATs) and human potential cults (HPCs).]
Thought-reform programs have been employed in attempts to
control and indoctrinate individuals, societal groups (e.g., intellectuals),
and even entire populations. Systems intended to accomplish these goals can
vary considerably in their construction. Even the first systems studied under
the label "thought reform" ranged from those in which confinement and
physical assault were employed (Schein 1956; Lifton 1954; Lifton 1961, pp.
19-85) to applications that were carried out under nonconfined conditions, in
which nonphysical coercion substituted for assault (Lifton 1961, pp. 242-273;
Schein 1961, pp. 290-298). The individuals to whom these influence programs
were applied were in some cases unwilling subjects (prisoner populations) and
in other cases volunteers who sought to participate in what they believed might
be a career-beneficial, educational experience (Lifton 1961, p. 248).
Significant differences existed between the social
environments and the control mechanisms employed in the two types of programs
initially studied. Their similarities, however, are of more importance in
understanding their ability to influence behavior and beliefs than are their
differences. They shared the utilization of coercive persuasion's key, effective influence mechanisms:
1. a focused attack on the stability of a person's sense
of self;
2. reliance on peer group interaction;
3. the development of interpersonal bonds between targets
and their controllers and peers; and
4. an ability to control communication among participants.
Edgar Schein captured the essential similarity between the
types of programs in his definition of the coercive-persuasion phenomenon.
Schein noted that even for prisoners, what happened was a subjection to
"unusually intense and prolonged persuasion" [as in the LGATs of the
'70s and '80s] that they could not avoid; thus, "they were coerced into
allowing themselves to be persuaded" (Schein 1961, p. 18).
Programs of both types (confined/assaultive and
non-confined/non-assaultive) cause a range of cognitive and behavioral
responses. The reported cognitive responses vary from apparently rare
instances, classifiable as internalized belief change (enduring change), to a
frequently observed transient alteration in beliefs that appears to be
situationally adaptive and, finally, to reactions of nothing less than firm
intellectual resistance and hostility (Lifton 1961, pp. 117-151, 399-415;
Schein 1961, pp. 157-166).
The phrase situationally adaptive belief change refers to
attitude change that is not stable and is environment dependent. This type of
response to the influence pressures of coercive-persuasion programs is perhaps
the most surprising of the responses that have been observed. The combination
of psychological assault on the self, interpersonal pressure, and the social
organization of the environment creates a situation that can only be coped with
by adapting and acting so as to present oneself to others in terms of the
ideology supported in the environment (see below for discussion). [This is
precisely what I saw going on across all six types of organizations:
1. Karpman Drama Triangle AFs vs. neglecting, nurturing,
or authoritative families as defined by Baumrind and where the objective of continued membership is a dependency upon support and protection;
2. SATPs like the various 12 Step programs (see Wilson),
and commercial operations like Hazelden-Betty Ford, the Behavioral Medicine
Center, and The Meadows (for the "better," mostly) where the objective of continued membership is escape from domination by addiction to a chemical substance or supposedly protective behavior;
3. MLMs like Amway, Herbalife, Mary Kay Cosmetics, Nu
Skin, Primerica, Shaklee and World Financial where the objective of continued membership is financial enrichment and escape from anxiety about economic threat;
4. ECRCs like the old-school Pentecostals, Assemblies of
God, Calvary Chapel, Church of God, The Rock, and the Jesus Army where the objective of continued membership is group support and relief of anxiety about death, dying and physical incapacity;
5. LGATs (including those used for "corporate culture
in-doctrine-ation") like Leadership Dynamics, Lifespring, PSI, est, The
Forum, Landmark Education, Tony Robbins Seminars, and Benchmark (which may be a
CoS development) where the objective of continued membership is self-discovery, relief of emotional suffering and/or ego empowerment for career and social advancement; and
6. HPCs like the Hare Krishnas, the Moonies, Silva Mind
Control, Eckankar, the Center for Feeling Therapy and the CoS where the objective of continued membership is similar to the objective in the LGATs, but is removed from -- and sometimes even rejective of -- mainstream cultural identification.
See also A Dozen-plus Categories of Cults.
In the KDT AF, the personality orientation (and possibly disorder) of the member -- not the guru, priest or dominator -- is usually akin to the "cooperative / dependent" in Millon's taxonomy. In the SATPs, the personality orientation varies considerably, but is most commonly at least somewhat "non-conforming / antisocial." In MLMs, the PO is more typically "conscientious / compulsive" and/or "confident / narcissistic." In the ECRCs, the PO is overwhelmingly "cooperative / dependent." In the LGATs, it is quite varied but tends toward "aggrieved / masochistic" and "conscientious / compulsive." In the HPCs, one sees much the same underlying personality structures as in the LGATs, but with a surface or masque of "sociable / histrionic" and "eccentric / schizotypal" manifestations.]
Eliciting the desired verbal and interactive behavior sets
up conditions likely to stimulate the development of attitudes that are
consistent with, and that function to rationalize, the new behavior in which
the individual is engaging. Models of attitude change, such as the theory of
Cognitive Dissonance (Festinger 1957) or Self-Perception Theory (Bem 1972),
explain the tendency for consistent attitudes to develop as a consequence of behavior.
The surprising aspect of the situationally adaptive
response is that the attitudes that develop are unstable. They tend to change
dramatically once the person is removed from an environment that has totalistic
properties and is organized to support the adaptive attitudes. Once removed
from such an environment, the person is able to interact with others who permit
and encourage the expression of criticisms and doubts, which were previously
stifled because of the normative rules of the reform environment (Schein 1961,
p. 163; Lifton 1961, pp. 87-116, 399-415; Ofshe and Singer 1986). This pattern
of change, first in one direction and then the other, dramatically highlights
the profound importance of social support in the explanation of attitude change
and stability. This relationship has for decades been one of the principal
interests in the field of social psychology.
Statements supportive of the proffered ideology that
indicate adaptive attitude change during the period of the target's involvement
in the reform environment and immediately following separation should not be
taken as mere playacting in reaction to necessity. Targets tend to become
genuinely involved in the interaction. The reform experience focuses on genuine
vulnerabilities [from having been in-struct-ed (by the social con-struct-ion of reality, as per Berger & Luckmann), in-doctrine-ated, conditioned, programmed, socialized, habituated and normalized to
a) belief and rule-following rather than observation and
conscious choice,
b) in Tart's consensus trance,
c) Bowlby's anxious attachment, and
d) (not necessarily sexual) sado-masochistic
e) submission to more "competent" authority
f) and dominance of less "competent" others,
g) usually by means of a variable schedule of reinforcement
h) of rewards and punishments
i) to a near-permanent state of interpersonal codependence on the
j) Karpman Drama Triangle for
the sake of "functional" (which depends upon who thinks so) societal organization]
as the method for undermining self-concept: manipulating
genuine feelings of guilt about past conduct; inducing the target to make
public denunciations of his or her prior life as being unworthy; and carrying
this forward through interaction with peers for whom the target develops strong
bonds [precisely as I witnessed first-hand in AFs, as well as the SATPs, the
LGATs and the HPCs]. Involvement developed in these ways prevents the target
from maintaining either psychological distance or emotional independence from the
experience.
The reaction pattern of persons who display adaptive
attitude-change responses is not one of an immediate and easy rejection of the
proffered ideology. This response would be expected if they had been faking
their reactions as a conscious strategy to defend against the pressures to
which they were exposed. Rather, they appear to be conflicted about the
sentiments [actually instructed beliefs] they developed and their reevaluation
of these sentiments. This response has been observed in persons reformed under
both confined / assaultive and nonconfined / nonassaultive reform conditions
(Schein 1961, pp. 163-165; Lifton 1961, pp. 86-116, 400-401).
Self-concept and belief-related attitude change in
response to closely controlled social environments have been observed in other
organizational settings that, like reform programs, can be classified as total
institutions (Goffman 1957). Thought-reform reactions also appear to be related
to, but are far more extreme than, responses to the typically
less-identity-assaultive and less-totalistic socialization programs carried out
by organizations with central commitments to specifiable ideologies, and which
undertake the training of social roles (e.g., in military academies [and
"boot camp" basic training, "sales motivation," corporate
"culture in-doctrine-ation," severely AFs, HPCs, LGATs, MLMs and
ECRCs] and religious-indoctrination settings [including monasteries of certain
types] (Dornbusch 1955; Hulme 1956).
The relatively rare instances in which belief changes are
internalized and endure have been analyzed as attributable to the degree to
which the acquired belief system and imposed peer relations function fully to
resolve the identity crisis that is routinely precipitated during the first
phase of the reform process [think "Marine Corps," "championship
football team," and "sales department," (see Cialdini)] (Schein
1961, p. 164; Lifton 1961, pp. 131-132, 400). Whatever the explanation for why
some persons internalize the proffered ideology in response to the reform procedures,
this extreme reaction should be recognized as both atypical and probably
attributable to an interaction between long-standing personality traits
[including dominance-and-submission-brand, codependent authoritarianism (see Garrett, Mellody, Schaef, and Weinhold & Weinhold) conferred in the family of
origin, as well as at public school] and the mechanisms of influence utilized
during the reform process.
Much of the attention to reform programs was stimulated
because it was suspected that a predictable and highly effective method for
profoundly changing beliefs had been designed, implemented, and was in
operation. These suspicions are not supported by fact. Programs identified as
thought reforming are not very effective at actually changing people's beliefs
in any fashion that endures apart from an elaborate supporting social context.
[Which supports my observation that most if not all of the cult members and
exiters I have encountered came from (think "AF") family and school
(and sometimes workplace) environments that in-struct-ed, programmed,
socialized and normalized them to unconscious authoritarianism (see Adorno, Altemeyer, Arendt, Baumrind, and Garrett) long before they
ever walked through the auditorium, church or assembly room door. Wiking
Germanism (yah; mein fuehrer! (see Meerloo), albeit at a less observable level)
is the glue that holds this cult-ure together.] Evaluated only on the criterion
of their ability genuinely to change beliefs, the programs have to be judged
abject failures and massive wastes of effort.
The programs are, however, impressive in their ability to
prepare targets for integration into and long-term participation in the
organizations that operate them. Rather than assuming that individual belief
change is the major goal of these programs, it is perhaps more productive to
view the programs as elaborate role-training regimes [which makes them little
different from any form of schooling, save for the fact that they are usually
much more efficient]. That is, as re-socialization programs in which targets
are being prepared to conduct themselves in a fashion appropriate for the
social roles they are expected to occupy following conclusion of the training
process [e.g.: to be good little producers, consumers and defenders of the
cult's wealth].
If identified as training programs, it is clear that the
goals of such programs are to reshape behavior and that they are organized
around issues of social control important to the organizations [and leaders,
gurus, sales directors, ministers, priests, rabbis, imams, generals, admirals,
etc., thereof] that operate the programs. Their objectives then appear to be
behavioral training of the target, which results in an ability to present self,
values, aspirations, and past history in a style appropriate to the ideology of
the controlling organization; to train an ability to reason in terms of the
ideology; and to train a willingness to accept direction from those in
authority with minimum apparent resistance [Which is all precisely what I
observed in the AFs, SATPs, LGATs, ECRCs, and HPCs in which I participated or
infiltrated (I have never been part of an MLM, but have known several people
who were)]. Belief changes that follow from successfully coercing or inducing
the person to behave in the prescribed manner can be thought of as by-products
of the training experience. As attitude-change models would predict, they
arise "naturally" as a result of efforts to reshape behavior
(Festinger 1957; Bem 1972). [In fact, the parent, the leader, the CEO, the head
coach, the guru, the sales director, the minister, the priest, the rabbi, the
imam, the general, the admiral, etc., who is a cynical sociopath doesn't care
at all what the "little people" believe or don't believe, so long as
they conform and perform.]
The tactical dimension most clearly distinguishing reform
processes from other sorts of training programs is the reliance on
psychological coercion: procedures that generate pressure to comply as a means
of escaping a punishing experience (e.g., public humiliation, sleep
deprivation, guilt manipulation, etc.) [precisely what Watson and Skinner
described as one of the bedrock concepts of behaviorism and "behavior modification"]. Coercion differs from other influencing factors also
present in thought reform, such as content-based persuasive attempts (e.g.,
presentation of new information, reference to authorities, etc.) or reliance on
influence variables operative in all interaction (status relations, demeanor,
normal assertiveness differentials, etc.) [I saw all of this in the AFs, SATPs,
LGATs, ECRCs, and HPCs in which I participated or at least observed directly]. Coercion is
principally utilized to gain behavioral [...submission and...] compliance at
key points and to ensure participation in activities likely to have influencing
[in-flow-encing; flues are channels for liquid flow] effects; that is, to
engage the person in the role training activities and in procedures likely to
lead to strong emotional responses, to cognitive confusion, or to attributions
to self as the source of beliefs [turning Weiner upside down] promoted during
the process.
Robert Lifton labeled the extraordinarily high degree of
social control characteristic of organizations that operate reform programs as
their totalistic quality (Lifton 1961). This concept refers to the mobilization
of the entirety of the person's social, and often physical, environment in
support of the manipulative effort [think "six weeks... at Parris
Island," "28 days in the residential treatment center,"
"three nights a week in the sanctuary plus social activities in the
community room," "250 people in a hotel ballroom for four sessions of
15 hours each"]. Lifton identified eight themes or properties of reform
environments that contribute to their totalistic quality:
1. Control of communication
2. Emotional and behavioral manipulation
3. Demands for absolute conformity to behavior
prescriptions derived from the ideology
4. Obsessive demands for confession
5. Agreement that the ideology is faultless
6. Manipulation of language in which cliches substitute
for analytic thought
7. Reinterpretation of human experience and emotion in
terms of doctrine
8. Classification of those not sharing the ideology as
inferior and not worthy of respect
(Lifton 1961, pp. 419-437, 1987). [Though I did not see
this first hand in any MLM trainings, I could infer it from the attitudes and
behaviors of the participants. But I saw and heard every one of the eight in
the authoritarian families (AFs), substance abuse treatment programs (SATPs),
large-group awareness trainings (LGATs), evangelical and/or charismatic
religious cults (ECRCs), and human potential cults (HPCs) in which I participated or
directly observed. Some of the eight were more subtle in some cases (as in the SATPs
and ECRCs); some were very obvious (as in the LGATs and HPCs). I also observed
all -- save for item four -- in both basic military and officer training
schools. ... What strikes me as perplexing is the absence of any mention of the
alternating use of subtle-to-obvious affection, approval, "boat
floating," "ego stroking," "love bombing" and
"rescuing" (as on the Karpman Drama Triangle) vs. equally
subtle-to-obvious discounting, disclaiming, criticizing, embarrassing,
humiliating, blaming, demonizing, punishing and "persecuting" (as on
that same KDT) I saw again and again in all of the AFs, LGATs, ECRCs, HPCs and corporate
semi-cults to which I was witness, as well as what I heard about second hand
from the members or former members of several MLMs. (I did not see this
confusing, "crazy-making," "attachment splitting" (see
Bowlby, Cassidy & Shaver, and Garrett) behavior in any SATP, but did observe
it in the military during the Vietnam era, as well as in the realm of political
organizations -- including a statewide political offshoot of one of the big,
West Coast LGATs -- in the '00s.)]
Schein's analysis of the behavioral sequence underlying
coercive persuasion separated the process into three subphases: unfreezing,
change, and refreezing (Schein 1961, pp. 111-139). Phases differ in their
principal goals and their admixtures of persuasive, influencing, and coercive tactics.
Although others have described the process differently, their analyses are not
inconsistent with Schein's three-phase breakdown (Lifton 1961; Farber, Harlow,
and West 1956; Meerloo 1956; Sargant 1957; Ofshe and Singer 1986). Although
Schein's terminology is adopted here, the descriptions of phase activities have
been broadened to reflect later research.
Unfreezing is the first step in eliciting behavior and
developing a belief system that facilitates the long-term management of a
person. It consists of attempting to undercut a person's psychological basis
for resisting demands for behavioral compliance to the routines and rituals of
the reform program. The goals of unfreezing are to destabilize a person's sense
of identity (i.e., to precipitate an identity crisis), to diminish confidence
in prior social judgments, and to foster a sense of powerlessness, if not
hopelessness [and, thus, Seligman's "learned helplessness" precisely
as though they had been put in one of those rat boxes]. Successful destabilization
induces a negative shift in global self-evaluations [destroying Branden's
"self-esteem" (no wonder he was so hip to what the LGATs and HPCs
were doing in the '70s); and Erikson's achievement of "identity,"
however "foreclosed" (as per Marcia) or "stable but
evolving" it may have been when they surrendered to The Master] and
increases uncertainty about one's values and position in society. It thereby
reduces resistance to the new demands for compliance while increasing
suggestibility.
Destabilization of identity [see above] is accomplished by
bringing into play varying sets of manipulative techniques. The first programs
to be studied utilized techniques such as repeatedly demonstrating the person's
inability to control his or her own fate, the use of degradation ceremonies,
attempts to induce reevaluation of the adequacy and/or propriety of prior
conduct [very much as is done in the 12 Step SATPs, which -- though mostly
ethical -- can go south in the hands of a cynical sociopath, cult recruiter, or
(as I saw from the '80s to the mid-'00s) one slave-labor AA sponsor or
another], and techniques designed to encourage the reemergence of
[unprocessed, un-"digested"] latent feelings of guilt and emotional
turmoil [as is so often seen in the authoritarian, rescuing (infantilizing) and
persecuting (victimizing),
codependence-inducing (see Mellody, Schaef, and Weinhold & Weinhold), Karpman Drama Triangle families in most therapy rooms]
(Hinkle and Wolfe 1956; Lifton 1954, 1961; Schein 1956, 1961; Schein, Cooley,
and Singer 1960). Contemporary programs [including the high-tech iterations of
the LGATs and HPCs, and even some of the ECRCs and military "special
forces"] have been observed to utilize far more psychologically
sophisticated procedures to accomplish destabilization. These techniques are
often adapted from the traditions of psychiatry, psychotherapy, hypnotherapy,
and the human-potential movement, as well as from religious practice [sadly
including subtle corruptions of Buddhist meditation and Catholic confession (see Batchelor, and Fronsdal)]
(Ofshe and Singer 1986; Lifton 1987).
The change phase allows the individual an opportunity to
escape punishing destabilization procedures by demonstrating that he or she has
learned the proffered ideology, can demonstrate an ability to interpret reality
in its own terms, and is willing to participate in competition with peers to
demonstrate zeal through displays of commitment. [How is any of this different
from the rescuing-infantilizing-here / persecuting-victimizing-there, Karpman
Drama Triangle dynamics of the typical AF?] In addition to
study and/or formal instruction, the techniques used to facilitate learning and
the skill basis that can lead to opinion change include scheduling events that
have predictable influencing consequences, rewarding certain conduct, and manipulating emotions to create punishing experiences[, much as is done by the
intimidating-style nun, smugly self-righteous college professor, guru, military
drill instructor or sports coach].
Some of the practices designed to promote influence might
include requiring the target to assume responsibility for the progress of
less-advanced "students" -- whose advancement becomes the responsibility of
those further along in the program -- to assume the role of a teacher of the
ideology, or to develop ever more refined and detailed confession statements
that recast the person's former life in terms of the required ideological
position [all as I witnessed in the more assertive 12 Step SATPs and HPCs].
Group structure is often manipulated by making rewards or
punishments for an entire peer group contingent on the performance of the
weakest person, requiring the group to utilize a vocabulary appropriate to the
ideology, making status and privilege changes commensurate with behavioral
compliance, subjecting the target to strong criticism and humiliation from
peers for lack of progress, and peer monitoring for expressions of reservations
or dissent [much as I observed in basic and advanced military training, as well
as in one of the bigger HPCs]. If progress is unsatisfactory, the individual
can again be subjected to the punishing destabilization procedures used during
unfreezing to undermine identity, to humiliate, and to provoke feelings of
shame and guilt [all of which are straight out of Forward's Emotional Blackmail].
Refreezing denotes an attempt to promote and reinforce [as
per Watson, Skinner, Bandura, Hayes, et al] behavior acceptable to the
controlling organization. Satisfactory performance is rewarded [as per Watson,
Skinner, Bandura, Hayes, et al] with social approval, status gains, and small
privileges [all of which I have seen used in Karpman Drama Triangle AFs, MLMs,
SATPs, HPCs, ECRCs, egregiously stressful corporate structures, and military --
and quasi-military -- organizations]. Part of the social structure of the
environment is the norm of interpreting the target's display of the desired
conduct as demonstrating the person's progress in understanding the errors of
his or her former life. The combination of reinforcing approved behavior and
interpreting its symbolic meaning as demonstrating the emergence of a new
individual fosters the development of an environment-specific, supposedly
reborn social identity. The person is encouraged to claim this [twisted]
identity and is rewarded for doing so.
[For one who understands the dynamics of codependence (see Mellody, Schaef and Weinhold & Weinhold) in
almost microscopic detail after 26 years in a 12 Step program therefore, the last two sentences -- and indeed, the three
sections on unfreezing, changing and refreezing -- are near flawless
descriptions of how the authoritarian family (AF) parenting style (see Baumrind) in-struct-s,
in-doctrine-ates, conditions, socializes, habituates, acculturates, accustoms,
normalizes and institutionalizes codependence in this cult-ure.]
Lengthy participation in an appropriately constructed [as
per Burrow's The Social Basis of Consciousness, and Berger & Luckman's The
Social Construction of Reality] and managed environment fosters peer relations,
an interaction history, and other behavior consistent with a public identity
that incorporates approved values and opinions. Promoting the development of an
interaction history in which persons engage in cooperative activity with peers
that is not blatantly coerced and in which they are encouraged but not forced
to make verbal claims to "truly understanding the ideology and having been
transformed," will tend to lead them to conclude that they hold beliefs
consistent with their actions (i.e., to make attributions to self as the source
of their behaviors). These reinforcement procedures can result in a significant
degree of cognitive confusion [or "cognitive dissonance," as per
Festinger] and an alteration in what the person takes to be his or her beliefs
and attitudes while involved in the controlled environment (Bem 1972; Ofshe et
al. 1974) [again; this is precisely what I observed in the populations of the
AFs, SATPs, ECRCs, HPCs, LGATs and corporate semi-cults (because some of
Lifton's characteristics were not apparent) I joined, worked with or
infiltrated over the course of five decades].
Continuous use of refreezing procedures can sustain the
expression of what appears to be significant attitude change for long periods
of time [long enough in some cases I have observed to physically sicken and at
least temporarily psychotize the "willing" (they won't leave)
participants; such is clearly the case at a large, almost "slave
labor" compound behind the high walls of an old resort about 70 miles east
of Los Angeles]. Maintaining compliance with a requirement that the person
display behavior signifying unreserved acceptance of an imposed ideology and
gaining other forms of long-term behavioral control requires continuous effort.
The person must be carefully managed, monitored, and manipulated through peer
pressure, the threat or use of punishment (material, social, and emotional) and
through the normative rules of the community (e.g., expectations prohibiting
careers independent of the organization, prohibiting formation of independent
nuclear families, prohibiting accumulation of significant personal economic
resources, etc.) (Whyte 1976; Ofshe 1980; Ofshe and Singer 1986). [All as
witnessed first-hand, as well as described by Flo Conway & Jim
Siegelman, Philip Cushman, Arthur Deikman, Mark Galanter, Sam Harris, Steven Hassan, Joel
Kramer & Diana Alstad, Michael Langone, Margaret Thaler Singer, Lawrence
Wright, Irwin Yalom and a long list of others, including all the recent tell-alls on the
CoS.]
The rate at which a once-attained level of attitude change
deteriorates depends on the type of social support the person receives over
time (Schein 1961, pp. 158-166; Lifton 1961, pp. 399-415). In keeping with the
refreezing metaphor, even when the reform process is to some degree successful
at shaping behavior and attitudes, the new shape tends to be maintained only as
long as the temperature is appropriately controlled.
One of the essential components of the reform process in
general and of long-term refreezing in particular is monitoring and limiting
the content of communication among persons in the managed group [much as any
self-respecting, totalitarian government worth its reputation has tried to do
since Hammurabi's dog was a pup 4000 years ago; "Red" China, Islamist
Iran and North Korea being the most recent examples] (Lifton 1961; Schein 1960;
Ofshe et al. 1974). If successfully accomplished, communication control
eliminates a person's ability safely to express criticisms or to share private
doubts and reservations. The result is to confer on the community the quality
of being a spy system of the whole, upon the whole. [This particular dynamic is
what investigators like Theodore Lidz, Gregory Bateson, Paul Watzlawick, Don D.
Jackson, Jay Haley, Ronald D. Laing, Aaron Esterson, Jules Henry and Eric
Bermann saw in many -- though not all -- of the "schizophrenogenic"
and otherwise "crazy-making," extreme authoritarian families they
observed from the late 1940s to early 1970s, and reported in tattered books now
sold on amazon.com at often precious prices.]
The typically observed complex of
communication-controlling rules requires people to self-report critical
thoughts to authorities or to make doubts known only in approved and readily
managed settings (e.g., small groups or private counseling sessions). Admitting
"negativity" leads to punishment or re-in[-doctrine-]ation through
procedures sometimes euphemistically termed "education" or
"therapy." Individual social isolation is furthered by rules
requiring peers to "help" colleagues to progress, by reporting their
expressions of doubt. If it is discovered, failure to make a report is
punishable, because it reflects on the low level of commitment of the person
who did not "help" a colleague to make progress.
Controlling communication effectively blocks individuals
from testing the appropriateness of privately held critical perceptions against
the views of even their families and most-valued associates. Community norms
[based on the norms of an AF?] encourage doubters to [mis-]interpret lingering
reservations as signs of a personal failure to comprehend the [supposed] truth
of the ideology; if involved with religious organizations, to [mis-]interpret
doubt as evidence of sinfulness [One may, with considerable edification about
"sin," refer to the late Lawrence Kohlberg's bedrock psych school
text on the six levels of moral interpretation. Most AFs, as well as some
SATPs, HPCs and LGATs, and all ECRCs, operate -- albeit often selectively -- at
the belief-based lower four, with ardent denial of the empirically grounded
upper two. And, in fact, it is the essence of all cults that empirical evidence
for any critique of the cult's ideology is hogwash. These people are Eric
Hoffer's True Believers; period, the end.] or the result of demonic
influences; if involved with an organization delivering a supposed
psychological or medical therapy, as evidence of continuing illness and/or
failure to progress in treatment.
The significance of communication control is illustrated
by the collapse of a large psychotherapy organization in immediate reaction to
the leadership's loss of effective control over interpersonal communication. At
a meeting of several hundred of the members of this "therapeutic
community," clients were allowed openly to voice privately held
reservations about their treatment and exploitation. They had been subjected to
abusive practices, which included assault, sexual and economic exploitation,
extremes of public humiliation, and others. When members discovered the extent
to which their sentiments about these practices were shared by their peers they
rebelled (Ayalla 1985). [I'm not willing to shove a Jackson out onto the table,
but I am pretty sure this refers to the Synanon SATP in Santa Monica, CA, in
the 1960s and '70s.]
Two widespread myths have developed from misreading the
early studies of thought reforming influence systems (Zablocki 1991). These
studies dealt in part with their use to elicit false confessions in the Soviet
Union after the 1917 revolution; from American and United Nations forces held
as POWs during the Korean War; and from their application to Western
missionaries held in China following Mao's revolution.
The first myth concerns the necessity and effectiveness of
physical abuse in the reform process. The myth is that physical abuse is not
only necessary but is the prime cause of apparent belief change. Reports about
the treatment of POWs and foreign prisoners in China documented that physical
abuse was present. Studies of the role of assault in the promotion of attitude
change and in eliciting false confessions even from U.S. servicemen revealed,
however, that it was ineffective. Belief change and compliance were more likely
when physical abuse was minimal or absent (Biderman 1960). Both Schein (1961)
and Lifton (1961) reported that physical abuse was a minor element in the
theoretical understanding of even prison reform programs in China.
In the main, efforts at resocializing China's nationals
were conducted under nonconfined / nonassaultive conditions. Millions of
China's citizens underwent reform in schools, special-training centers,
factories, and neighborhood groups in which physical assault was not used as a
coercive technique. One such setting for which many participants actively
sought admission, the "Revolutionary University," was classified by
Lifton as the "hard core of the entire Chinese thought reform
movement" (Lifton 1961, p. 248).
Attribution theories would predict that if there were differences between the power of reform programs to promote belief change in settings that were relatively more or less blatantly coercive and physically threatening, the effect would be greatest in less-coercive programs. Consistent with this expectation, Lifton concluded that reform efforts directed against Chinese citizens were "much more successful" than efforts directed against Westerners (Lifton 1961, p. 400).
[The extremist "political" cult I was able to
infiltrate in the '00s was not out to change political beliefs so much as to
rather cynically manipulate the "true believers" (as per Hoffer) to
work themselves raw on behalf of The Beloved Cause as door-to-door and house
party fund raisers and telephone callers "getting out the vote." The
language used was neuro linguistic programmese, straight-up, and closely
resembled the style of several of the big, West Coast LGATs of the '70s and
'80s, but the techniques were right out of the MLMs.]
A second myth concerns the purported effects of brainwashing.
Media reports about thought reform's effects far exceed the findings of
scientific studies--which show coercive persuasion's upper limit of impact to
be that of inducing personal confusion and significant, but typically
transitory, attitude change. [It is useful to understand that the typical HPC
and LGAT guru knows he or she is running a revolving door operation for the
many... in which the few who "stick" will do so because they are
either Just Plain Stupid(ified into codependence; see Mellody, Schaef and Weinhold & Weinhold) or they are as cynical as the guru and want a piece of
the pie. Some will blackmail for it, of course, and be allowed by the guru to
extort funds and other perks for so doing. I watched a gorgeous former call
girl and porn actress / loop producer who thought she could run her ex-husband's "ministry for schizoid
former skid row drunks" get snagged this way. She had to sell off the
mansion to get enough cash to keep the thugs happy until her attorneys and
other thugs were able to deal with the troublemakers.] Brainwashing was
promoted as capable of stripping victims of their capacity to assert their
wills, thereby rendering them unable to resist the orders of their controllers.
People subjected to "brainwashing" were not merely influenced to
adopt new attitudes but, according to the myth, suffered essentially an alteration
in their psychiatric status from normal to pathological, while losing their
capacity to decide to comply with or resist orders.
This lurid promotion of the power of thought reforming influence techniques to change a person's capacity to resist direction is entirely without basis in fact: No evidence, scientific or otherwise, supports this proposition. No known mental disorder produces the loss of will that is alleged to be the result of brainwashing. Whatever behavior and attitude changes result from exposure to the process, they are most reasonably classified as the responses of normal individuals to a complex program of influence. [Much as I agree with most of this article, I have to take issue with this... issue. My experience is that there are some who are unconsciously -- as opposed to consciously and cynically -- anti-social, sociopathic, psychopathic and even sadistic (see Millon et al, Hare, and Zimbardo) without being anywhere near as masochistic as the majority of the (from the point of view of the gurus) "saps" or "lops" in most of these deals. These people walked through the door ready to be used so long as there was something in it for them, and they took to The Program like ducks to water. Frankly, I don't see that as much different from the extreme, pre-entry codependence that's rife among some of the newbies. In both circumstances, they come with psychological prerequisites (including sociopathic narcissism; see Millon et al) that predispose them to be affected in the long term by the empowerments the cult provides that they see (however unconsciously) as their just due. Many of them move up the pyramidic ladders in these organizations (one sees it all over the place in the CoS) as hope-to-die-in-the-cloth "true believers" (see Hoffer). And many of them do, despite having egregiously discomfiting cognitive dissonance about what they are doing. Such people are in the distinct minority, however.]
The U.S. Central Intelligence Agency seems to have taken
seriously the myth about brainwashing's power to destroy the will. Due,
perhaps, to concern that an enemy had perfected a method for dependably
overcoming will -- or perhaps in hope of being the first to develop such a
method --the Agency embarked on a research program, code-named MK ULTRA. It
became a pathetic and tragic failure. On the one hand, it funded some innocuous
and uncontroversial research projects; on the other, it funded or supervised
the execution of several far-fetched, unethical, and dangerous experiments that
failed completely (Marks 1979; Thomas 1989). [Sorry, friends. I am NOT going
into MKUltra past mentioning a few names like Lee Harvey Oswald, Sirhan Sirhan
and John Hinckley, and the notion that it's not about "destroying the
will." MKUltra was about putting a steering wheel on "the
voices." If you want to go there, be my guest. But wear a helmet and a
seatbelt.]
Although no evidence suggests that thought reform is a
process capable of stripping a person of the will to resist, a relationship
does exist between thought reform and changes in psychiatric status. The stress
and pressure of the reform process cause some percentage of psychological
casualties. To reduce resistance and to motivate behavior change,
thought-reform procedures rely on psychological stressors, induction of high
degrees of emotional distress, and on other intrinsically dangerous influence
techniques (Heide and Borkovec 1983). The process has a potential to cause
psychiatric injury, which is sometimes realized. The major early studies
(Hinkle and Wolff 1961; Lifton 1961; Schein 1961) reported that during the
unfreezing phase individuals were intentionally stressed to a point at which
some persons displayed symptoms of being on the brink of psychosis. Managers
attempted to reduce psychological pressure when this happened, to avoid serious
psychological injury to those obviously near the breaking point. [Just figure
this: Psychosis is built on emotionally loaded belief. And emotions and belief
can be manipulated with things as innocuous as popular music, motion pictures
and fake news stories. Some even think Presidents no one would have dreamed of
could be elected this way. (Heavens!) If one can do that with the ostensibly
"sane," what can one do with the already insane?]
Contemporary programs speed up the reform process through
the use of more psychologically sophisticated and dangerous procedures to
accomplish destabilization. In contemporary programs the process is sometimes
carried forward on a large group basis, which reduces the ability of managers
to detect symptoms of impending psychiatric emergencies. In addition, in some
of the "therapeutic" ideologies espoused by thought reforming
organizations, extreme emotional distress is valued positively, as a sign of
progress. Studies of contemporary programs have reported on a variety of
psychological injuries related to the reform process. Injuries include
psychosis, major depressions, manic episodes, and debilitating anxiety (Glass,
Kirsch, and Parris 1977; Haaken and Adams 1983; Heide and Borkovec 1983; Higgitt
and Murray 1983; Kirsch and Glass 1977; Yalom and Lieberman 1971; Lieberman
1987; Singer and Ofshe 1990). [Yup. Seen it. At least twenty times. And when
their autonomic "fight, flight or freeze" (because freeze is what
happens when one stays in emotionally loaded cognitive dissonance long enough)
nervous systems have been cranked to the max for ten or twenty years, one will
have a nasty case of complex post-traumatic stress disorder (as per Levine,
Heller, McEwen, Ogden, Sapolsky, Selye, van der Kolk, and Wolpe) with all
manner of cognitive distortions and defense mechanisms to try to
cope with it. Feh.]
Contemporary thought-reform programs are generally far
more sophisticated in their selection of both destabilization and influence
techniques than were the programs studied during the 1950s (see Ofshe and
Singer 1986 for a review). For example, hypnosis was entirely absent from the
first programs studied but is often observed in modern programs. [I have never
seen an LGAT that did not subject the assembled multitude to unannounced,
unclarified, unwarned hypnotic trances. I have myself been in such trances for
hours seated on stack chairs or lying on the carpet in hotel and convention
center ballrooms with hundreds of others. The Asian gurus of old were doing the
exact same thing millennia ago (see Fronsdal). Sufi group meditation is mass hypnosis (see Deikman, and Tart). (And
what do you think "chanting" is, Elmer?)] In most modern examples in
which hypnosis is present, it functions as a remarkably powerful technique for
manipulating subjective experience and for intensifying emotional response.
[It's the nastiest, most cynical use of meditation I know of. But adulterating
the good with the not-so-good is even... biblical, isn't it? (Sigh.)] It
provides a method for influencing people to imagine impossible events such as
those that supposedly occurred in their "past lives," the future, or
during visits to other planets. If persons so manipulated misidentify the
hypnotically induced fantasies, and classify them as previously unavailable
memories [been there; done that], their confidence in the content of a
particular ideology can be increased (Bainbridge and Stark 1980).
Hypnosis can also be used to lead people to allow
themselves to relive actual traumatic life events (e.g., rape, childhood sexual
abuse, near-death experiences, etc.) or to fantasize the existence of such
events and, thereby, stimulate the experience of extreme emotional distress
[sometimes inducing PTSD where none previously existed; see Levine, Heller, McEwen, Ogden, Sapolsky, Selye, van der Kolk, and Wolpe]. When embedded in a
reform program, repeatedly leading the person to experience such events can
function simply as punishment, useful for coercing compliance. [What is not
mentioned here is the classic set-up of inducing affective discomfort so that
the guru can appear to rescue the sufferer from it. Because psychic or physical
punishment, and rescue of the convinced victim, are the essence of interpersonal
attachment (see Bowlby, Cassidy & Shaver, and Shaver) on the Karpman Drama Triangle. It's the stuff many AFs, HPCs and
LGATs, and almost all ECRCs, are made of.]
Accounts of contemporary programs also describe the use of
sophisticated techniques intended to strip away psychological defenses, to
induce regression to primitive levels of coping, and to flood targets with
powerful emotion (Ayalla 1985; Haaken and Adams 1983; Hochman 1984; Temerlin
and Temerlin 1982). In some instances stress and fatigue have been used to
promote hallucinatory experiences that are defined as therapeutic (Gerstel
1982). [All of the LGATs I ran into used mass regression, emotion flooding,
stress and autonomic abuse to achieve what I heard famed football coach Vince
Lombardi say on TV 40 years ago: "Fatigue makes cowards of us all."]
Drugs have been used to facilitate disinhibition and heightened suggestibility [as in the Krishna Consciousness meditation cult of the '60s and '70s] (Watkins 1980). Thought-reform subjects have been punished for disobedience by
being ordered to self-inflict severe pain, justified by the claim that the result
will be therapeutic (Bellack et al. v. Murietta Foundation et al.).
Programs of coercive persuasion appear in various forms in
contemporary society. They depend on the voluntary initial participation of
targets. This is usually accomplished because the target assumes that there is
a common goal that unites him or her with the organization or that involvement
will confer some benefit (e.g., relief of symptoms, personal growth, spiritual
development, etc.). Apparently some programs were developed based on the assumption
that they could be used to facilitate desirable changes (e.g., certain
rehabilitation or psychotherapy programs). Some religious organizations and
social movements utilize them for recruitment purposes. Some commercial
organizations utilize them as methods for promoting sales. Under unusual
circumstances, modern police-interrogation methods can exhibit some of the
properties of a thought-reform program. In some instances, reform programs
[including some ECRC- and HPC-operated "adolescent behavior reform schools" and SATPs] appear to have been operated for the sole purpose of gaining
a high degree of control over individuals to facilitate their exploitation
(Ofshe 1986; McGuire and Norton 1988; Watkins 1980).
Virtually any acknowledged expertise or authority can
serve as a power base to develop the social structure necessary to carry out
thought reform. In the course of developing a new form of rehabilitation,
psychotherapy, religious organization, utopian community, school, or sales
organization it is not difficult to justify the introduction of thought-reform
procedures.
Perhaps the most famous example of a thought-reforming
program developed for the ostensible purpose of rehabilitation was Synanon, a
drug treatment program (Sarbin and Adler 1970; Yablonsky 1965; Ofshe et al.
1974). The Synanon environment possessed all of Lifton's eight themes. It used
as its principal coercive procedure a highly aggressive encounter/therapy group
interaction. In form it resembled "struggle groups" observed in China
(Whyte 1976), but it differed in content. Individuals were vilified and
humiliated not for past political behavior but for current conduct as well as
far more psychologically intimate subjects, such as early childhood
experiences, sexual experiences, degrading experiences as adults, etc. The
coercive power of the group experience to affect behavior was substantial as
was its ability to induce psychological injury (Lieberman, Yalom, and Miles
1973; Ofshe et al. 1974).
Allegedly started as a drug-rehabilitation program,
Synanon failed to accomplish significant long-term rehabilitation. Eventually,
Synanon's leader, Charles Dederich, promoted the idea that any degree of drug
abuse was incurable and that persons so afflicted needed to spend their lives
in the Synanon community. Synanon's influence program was successful in convincing
many that this was so. Under Dederich's direction, Synanon evolved from an
organization that espoused non-violence into one that was violent. Its soldiers
were dispatched to assault and attempt to murder persons identified by
Dederich as Synanon's enemies (Mitchell, Mitchell, and Ofshe 1981).
The manipulative techniques of self-styled messiahs, such
as People's Temple leader Jim Jones (Reiterman 1982), and influence programs
operated by religious organizations, such as the Unification Church (Taylor
1978) and Scientology (Wallis 1977; Bainbridge and Stark 1980), can be
analyzed as thought-reform programs. The most controversial recruitment system
operated by a religious organization in recent American history was that of the
Northern California branch of the Unification Church (Reverend Mr. Moon's
organization). The influence program was built directly from procedures of
psychological manipulation that were commonplace in the human-potential
movement (Bromley and Shupe 1981). The procedures involved various group-based
exercises as well as events designed to elicit from participants information
about their emotional needs and vulnerabilities. Blended into this program was
content intended slowly to introduce the newcomer to the group's ideology.
Typically, the program's connection with the Unification Church or any
religious mission was denied during the early stages of the reform process. The
target was monitored around the clock and prevented from communicating with
peers who might reinforce doubt and support a desire to leave. The physical
setting was an isolated rural facility far from public transportation.
Initial focus on personal failures, guilt-laden memories,
and unfulfilled aspirations shifted to the opportunity to realize infantile desires
and idealistic goals, by affiliating with the group and its mission to save the
world. The person was encouraged to develop strong affective bonds with current
members. They showed unfailing interest, affection, and concern [including "love bombing"], sometimes to the point of spoon-feeding the person's meals and
accompanying the individual everywhere, including to the toilet. If the
unfreezing and change phases of the program succeeded, the individual was told
of the group's affiliation with the Unification Church and assigned to another
unit of the organization within which refreezing procedures could be carried
forward.
Influence [see Cialdini; literally in-flow-ence] procedures now commonly used
during modern police interrogation can sometimes inadvertently manipulate
innocent persons' beliefs about their own innocence and, thereby, cause them
falsely to confess. Confessions resulting from accomplishing the unfreezing and
change phases of thought reform are classified as coerced-internalized false
confessions (Kassin and Wrightsman 1985; Gudjonsson and MacKeith 1988).
Although they rarely come together simultaneously, the ingredients necessary to
elicit a temporarily believed false confession are: erroneous police suspicion,
the use of certain commonly employed interrogation procedures, and some degree
of psychological vulnerability in the suspect. Philip Zimbardo (1971) has
reviewed the coercive factors generally present in modern interrogation
settings. Richard Ofshe (1989) has identified those influence procedures that,
if present in a suspect's interrogation, contribute to causing unfreezing and
change.
Techniques that contribute to unfreezing include falsely
telling a suspect that the police have evidence proving the person's guilt
(e.g., fingerprints, eyewitness testimony, etc.). Suspects may be given a
polygraph examination and then falsely told (due either to error or design)
that they failed and the test reveals their unconscious knowledge of guilt.
Suspects may be told that their lack of memory of the crime was caused by an
alcohol or drug induced blackout, was repressed, or is explained because the
individual is a multiple personality.
The techniques listed above regularly appear in modern
American police interrogations. They are used to lead persons who know that
they have committed the crime at issue to decide that the police have
sufficient evidence to convict them or to counter typical objections to
admitting guilt (e.g., "I can't remember having done that."). In
conjunction with the other disorienting and distressing elements of a modern
accusatory interrogation, these tactics can sometimes lead innocent suspects to
doubt themselves and question their lack of knowledge of the crime. If innocent
persons subjected to these sorts of influence techniques do not reject the false
evidence and realize that the interrogators are lying to them, they have no
choice but to doubt themselves.
Tactics used to change the suspect's position and elicit a
confession include maneuvers designed to intensify feelings of guilt and
emotional distress following from the suspect's assumption of guilt. Suspects
may be offered an escape from the emotional distress through confession. It may
also be suggested that confession will provide evidence of remorse that will
benefit the suspect in court.
Thought reform is not an easy process to study for several
reasons. The extraordinary totalistic qualities and hyperorganization of
thought-reforming environments, together with the exceptional nature of the
influence tactics that appear within them, put the researcher in a position
roughly analogous to that of an anthropologist entering into or interviewing
someone about a culture that is utterly foreign. The researcher cannot assume
that he or she understands or even knows the norms of the new environment. This
means that until the researcher is familiar with the constructed environment
within which the reform process takes place, it is dangerous to make the
routine assumptions about context that underlie research within one's own
culture. This problem extends to vocabulary as well as to norms and social
structure. [Times have changed. There are lots of "informed, school
trained" exiters now. And we're out there pretty calmly taking it all in
with detachment, and without fear of being manipulated. If one got his or her
mindfulness from Jiddu Krishnamurti, Ramana Maharshi, Alan Watts, S. N. Goenka,
Daniel Goleman, Chogyam Trungpa, Pema Chodron, Anthony de Mello, Jean Klein,
Arthur Deikman, Charles Tart, Stephen Levine, Jon Kabat-Zinn, Marsha Linehan,
Joel Kramer, Eckhart Tolle, Tara Brach, Steven Hayes, Mark Williams, Daniel
Siegel, Stephen Batchelor, Gil Fronsdal, and the like -- and processed what happened to them in their own AFs -- they're not going to
fall for the caca shoveled out in any of the MLMs, HPCs, LGATs, or ECRCs.]
The history of research on the problem has been one in
which most of the basic descriptive work has been conducted through post-hoc
interviewing of persons exposed to the procedures. The second-most frequently
employed method has been that of participant observation. Recently, in
connection with work being done on police interrogation methods, it has been
possible to analyze contemporaneous recordings of interrogation sessions in
which targets' beliefs are actually made to undergo radical change. All this
work has contributed to the development of an understanding of the
thought-reform phenomenon in several ways.
Studying the reform process demonstrates that it is no
more or less difficult to understand than any other complex social process and
produces no results to suggest that something new has been discovered. The only
aspect of the reform process that one might suggest is new is the order in
which the influence procedures are assembled and the degree to which the
target's environment is manipulated in the service of social control. This is
at most an unusual arrangement of commonplace bits and pieces.
Work to date has helped establish a dividing line between
the lurid fantasies about mysterious methods for stripping one's capacity to
resist control and the reality of the power of appropriately designed social
environments to influence the behavior and decisions of those engaged by them.
Beyond debunking myths, information gathered to date has been used in two ways
to further the affirmative understanding of thought reform. First, it has been
possible to describe the social structure and operations of
thought-reforming environments and to identify the range of influence
mechanisms they tend to incorporate. The second use of these data has been to
relate the mechanisms of influence present in the reform environment to
respondents' accounts of their reactions to these experiences, to increase
understanding of both general response tendencies to types of influence
mechanisms and the reactions of particular persons to the reform experience.
As with all complex, real-world social phenomena that cannot be studied experimentally, understanding the thought-reform process proceeds through the application of theories that have been independently developed. Explaining data that describe the type and organization of the influence procedures that constitute a thought-reform process depends on applying established social-psychological theories about the manipulation of behavior and attitude change. Assessing reports about the impact of the experience on the personalities subjected to intense influence procedures depends on the application of current theories of personality formation and change. Understanding instances in which the reform experience appears related to psychiatric injury requires proceeding as one would ordinarily in evaluating any case history of a stress-related [as in PTSD; see Levine, Heller, McEwen, Ogden, Sapolsky, Selye, van der Kolk, and Wolpe; because many of the exiters I have encountered have pretty obvious PTSD symptoms (including unremitting anxiety, mania, and/or depression) and, consequently, complex defense-mechanism schemes (see Vaillant), including several types of DSM Axis II Cluster B personality disorders] or other type of psychological injury.
Steven Hassan's BITE Model
of Cult Characteristics
To illustrate the mechanisms by which such stress and downline PTSD are induced, I've added the BITE Model (available for viewing at Hassan's very useful https://freedomofmind.com/ website). What is "stressful" should be obvious.
Behavior Control
Promote dependence and
obedience.
Modify behavior with
rewards and punishments.
Dictate where and with
whom you live.
Restrict or control
sexuality.
Control clothing and
hairstyle.
Regulate what and how much
you eat and drink.
Deprive you of seven to
nine hours of sleep.
Exploit you financially.
Restrict leisure time and
activities.
Require you to seek
permission for major decisions.
Information Control
Deliberately withhold and
distort information.
Forbid you from speaking
with ex-members and critics.
Discourage access to
non-cult sources of information.
Divide information into
"insider" vs. "outsider" doctrine.
Generate and use propaganda
extensively.
Use information gained in
confession sessions against you.
Gaslight to make you doubt
your own memory.
Require you to report
thoughts, feelings, & activities to superiors.
Encourage you to spy and
report on others’ “misconduct.”
Thought Control
Instill black vs. white,
us vs. them & good vs. evil thinking.
Change your identity,
possibly even your name.
Use loaded language and
cliches to stop complex thought.
Induce hypnotic or trance
states to indoctrinate.
Teach thought-stopping techniques
to prevent critical thoughts.
Allow only positive
thoughts.
Use excessive meditation,
singing, prayer & chanting to block thoughts.
Reject rational analysis,
critical thinking, & doubt.
Emotional Control
Instill irrational fears
(phobias) of questioning or leaving the group.
Label some emotions as
evil, worldly, sinful, or wrong.
Teach emotion-stopping
techniques to prevent anger, homesickness.
Promote feelings of guilt,
shame & unworthiness.
Shower you with praise and
attention (“love bombing”).
Threaten your friends and
family.
Shun you if you disobey or
disbelieve.
Teach that there is no
happiness or peace outside the group.
Liabilities of LGAT Mass Marathons
Gottschalk and Pattison's 13 liabilities of encounter groups (1969) (reprinted and excerpted from Cushman, 1993), some of which are similar to characteristics of most current mass marathon psychotherapy / large group awareness training sessions:
1) They lack adequate participant-selection criteria.
2) They lack reliable norms, supervision, and adequate training for leaders.
3) They lack clearly defined responsibility.
4) They sometimes foster pseudoauthenticity and pseudoreality.
5) They sometimes foster inappropriate patterns of relationships.
6) They sometimes ignore the necessity and utility of ego defenses.
7) They sometimes teach the covert value of total exposure instead of valuing personal differences.
8) They sometimes foster impulsive personality styles and behavioral strategies.
9) They sometimes devalue critical thinking in favor of "experiencing" without self-analysis or reflection.
10) They sometimes ignore stated goals, misrepresent their actual techniques, and obfuscate their real agenda.
11) They sometimes focus too much on structural self-awareness techniques and misplace the goal of democratic education; as a result, participants may learn more about themselves (some of which they may find egregiously discomfiting owing to conflicts with their moral and ethical beliefs; see Kohlberg) and less about (the manipulative, in-doctrine-ating) group process.
12) They pay inadequate attention to decisions regarding time limitations. This may lead to increased pressure on some participants to "fabricate" a cure.
13) They fail to adequately consider the "psychonoxious" or deleterious effects of group participation or adverse countertransference reactions.
Causes of Psychiatric Casualties
By sending out researchers to attend the basic trainings of several LGATs, Lieberman, Yalom & Miles (1973) observed a nearly 9.4% rate of "psychiatric casualties." As excerpted from Cushman (see above), "The authors... determined that it was neither the psychological traits of the subjects (i.e., predispositional factors) nor the ideology of the leaders (i.e., doctrinal factors) that determined the casualty rate. Instead, surprisingly, it was the style of leadership that was primary. Leaders who were aggressive, stimulating, intrusive, confrontive, challenging, personally revealing, and authoritarian were the leaders who caused the casualties.
"Leaders...
1) had rigid, unbending beliefs about what participants should experience and believe, how they should behave in the group, and when they should change.
2) had no sense of differential diagnosis and assessment skills, valued cathartic emotional breakthroughs as the ultimate therapeutic experience, and sadistically pressed to create or force a breakthrough in every participant.
3) had an evangelical system of belief that was the one single pathway to salvation.
4) were true believers and sealed their doctrine off from discomforting data or disquieting results and tended to discount a poor result by 'blaming the victim.'"
Goleman's Warnings You Might be in a Cult
From: Daniel Goleman: Early Warning Signs for the Detection
of Spiritual Blight, in the Newsletter of the Association for Transpersonal
Psychology, Summer 1985.
1) Taboo Topics: questions that can't be asked, doubts that
can't be shared, misgivings that can't be voiced. For example, "Where does
all the money go?" or "Does Yogi sleep with his secretary?"
2) Secrets: the suppression of information, usually tightly
guarded by an inner circle. For example, the answers "Swiss bank
accounts," or "Yes, he does... and that's why she had an
abortion."
3) Spiritual Clones: in its minor form, stereotypic
behavior, such as people who walk, talk, smoke, eat and dress just like their
leader; in its much more sinister form, psychological stereotyping, such as an
entire group of people who manifest only a narrow range of feeling in any and
all situations: always happy, or pious, or reducing everything to a single
explanation, or sardonic, etc.
4) Groupthink: a party line that overrides how people
actually feel. Typically, the cognitive glue that binds the group. For example,
"You're fallen, and Christ is the answer," or "You're lost in samsara,
and Buddha is the answer" [Pali Canon or "real" Buddhists do not believe in any deities, by
the way], and "You're impure, and Shiva is the answer."
5) The Elect: a shared delusion of grandeur that there is no
Way but *this* one. The corollary: you're lost if you leave the group.
6) No Graduates: members are never weaned from the group.
Often accompanies the corollary above.
7) Assembly Lines: everyone is treated identically, no
matter what their differences; e.g., mantras assigned by dictates of a
demographic checklist.
8) Loyalty Tests: members are asked to prove loyalty to the
group by doing something that violates their personal ethics; for example, set
up an organization that has a hidden agenda of recruiting others into the
group, but publicly represents itself as a public service outfit.
9) Duplicity: the group's public face misrepresents its true
nature, as in the example just given.
10) Unifocal Understanding: a single world view is used to
explain anything and everything; alternate explanations are verboten. For
example, if you have diarrhea, it's the "Guru's Grace." If it stops,
it's *also* the Guru's Grace. And if you get constipated, it's still the Guru's
Grace.
11) Humorlessness: no irreverence allowed. Laughing at
sacred cows is bad for your health.
Components of Forceful Indoctrination
UCLA Neuropsychiatric Unit head of service Louis West (a member of Edgar Schein's study group on North Korean reprogramming of US POWs' minds, among other research groups) developed a list of "eight basic components" of "forceful indoctrination" that was reported in an article in the Los Angeles Times in 1979. The list was summarized in Mithers (1994).
"Captors had to...
1) require prisoners to obey trivial demands, such as following minute rules and schedules. Such obedience gave the prisoners the habit of compliance.
2) demonstrate their omnipotence over their prisoners, thereby suggesting their resistance was futile.
3) offer unpredictable [see "variable schedule of reinforcement," a technique of "operant" behavioral conditioning in Skinner's and Watson's work] indulgences, rewards for compliance, unexpected kindness, and promises of better treatment. This provided positive motivation for obedience.
4) threaten their prisoners with punishments like isolation and change in treatment. These threats would produce constant anxiety and despair.
5) degrade prisoners in various ways, deny them privacy, and impose demeaning and humiliating punishments. This made resistance more threatening to self-esteem than compliance.
6) control their prisoners' environments.
7) isolate prisoners into small groups that developed an intense focus upon the self.
8) induce exhaustion, which weakened prisoners' ability to resist.
I personally witnessed (and was subjected to) -- or was a first-hand recipient of reports of -- such treatment by members at the seventh through ninth layers of the cultic pyramids of several human potential cults during the 1970s, including the one that was the subject of Mithers's very highly recommended, descriptive and detailed book, Therapy Gone Mad.
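Component 3 in West's list names the operant-conditioning principle at work: a variable schedule of reinforcement. Rewards that arrive unpredictably, even at the same average rate as a fixed schedule, are the schedules Skinner found produce the most persistent, extinction-resistant responding. A toy Python sketch of the difference between the two schedules (the 1-in-5 ratio is an arbitrary illustrative assumption, not a figure from the literature):

```python
import random
import statistics

def inter_reward_gaps(schedule, n_rewards=1000, seed=42):
    """Count responses between successive rewards under a schedule.
    `schedule` is a function (rng) -> bool deciding whether the
    current response is rewarded."""
    rng = random.Random(seed)
    gaps, since_last = [], 0
    while len(gaps) < n_rewards:
        since_last += 1
        if schedule(rng):
            gaps.append(since_last)
            since_last = 0
    return gaps

# Fixed-ratio 5: every 5th response is rewarded -- fully predictable.
_count = {"n": 0}
def fixed_ratio_5(rng):
    _count["n"] += 1
    return _count["n"] % 5 == 0

# Variable-ratio 5: each response rewarded with probability 1/5.
# Same average payoff, but the timing is unpredictable.
def variable_ratio_5(rng):
    return rng.random() < 0.2

fr = inter_reward_gaps(fixed_ratio_5)
vr = inter_reward_gaps(variable_ratio_5)
print("fixed:    mean", statistics.mean(fr), "stdev", statistics.stdev(fr))
print("variable: mean", round(statistics.mean(vr), 1),
      "stdev", round(statistics.stdev(vr), 1))
```

The fixed schedule's gaps are always exactly five responses (zero spread), while the variable schedule's gaps have the same mean but a wide spread; that unpredictability is what sustains compliance long after the "indulgences" become rare.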
See also:
The Cult Education Institute did not include
the references cited in the estimable Dr. Ofshe's article, but they are
available by contacting the Institute at info@culteducation.com.
My own references follow:
Adorno, T.;
Levinson, D.; et al: The Authoritarian Personality: Studies in Prejudice, orig.
pub, 1950, New York: W. W. Norton, 1993.
Altemeyer, R.: The
Authoritarian Specter, Boston: Harvard University Press, 1996.
Altemeyer, R.: The
Authoritarians, Charleston, SC: Lulu, 2006.
Arendt, H.: The
Origins of Totalitarianism (The Burden of Our Time), orig. pub. 1951, New York:
Harcourt, Brace, Jovanovich, 1973.
Arterburn, S.; Felton, J.: Toxic Faith: Understanding and Overcoming Religious Addiction, Nashville: Oliver-Nelson, 1991.
Bandura, A.: Self-Efficacy: The Exercise of Control, San Francisco: W. H. Freeman, 1997.
Batchelor, S.:
Buddhism Without Beliefs: A Contemporary Guide to Awakening, New York:
Riverhead / Penguin, 1997.
Bateson, G.,
Jackson, D., Haley, J.; et al: Perceval’s Narrative: A Patient’s Account of his
Psychosis, Palo Alto, CA: Stanford University Press, 1961.
Bateson, G.;
Jackson, D.; Haley, J.; Weakland, J.: Toward a Theory of Schizophrenia, in
Journal of Behavioral Science, Vol. 1, 1956.
Baumrind, D.:
Current Patterns of Parental Authority, a monograph in Developmental
Psychology, Volume 4, Number 1, Part 2, New York: American Psychological
Association, 1971.
Berger, P.;
Luckman, T.: The Social Construction of Reality: A Treatise in the Sociology of
Knowledge, New York: Doubleday, 1966.
Bermann, E.:
Scapegoat: The Impact of Death on an American Family, Ann Arbor: U. of Michigan
Press, 1973.
Bowlby, J.: A
Secure Base: Parent-Child Attachment and Healthy Human Development. London:
Routledge; New York: Basic Books, 1988.
Brach, T.: Radical
Acceptance: Embracing Your Life with the Heart of a Buddha, New York: Random
House / Bantam, 2004.
Branden, N.: The
Psychology of Self-Esteem, New York: Bantam Books, 1973.
Branden, N.: The
Disowned Self, New York: Bantam Books, 1976.
Burrow, T.: The
Social Basis of Consciousness, New York: Harcourt, Brace, 1927.
Cassidy, J.;
Shaver, P., eds.: Handbook of Attachment: Theory, Research and Clinical
Applications, New York: Guilford Press, 1999.
Chodron, P.: The
Places That Scare You: A Guide to Fearlessness in Difficult Times, Boston:
Shambhala, 2001.
Chodron, P.:
Taking the Leap: Freeing Ourselves from Old Habits and Fears, Boston: Shambhala,
2010.
Cialdini, R.:
Influence: Science and Practice, 4th Ed., New York: Allyn and Bacon, 2000.
Conway, F.;
Siegelman, J.: Snapping: America's Epidemic of Sudden Personality Change, New
York: Dell Delta, 1978.
Cushman,
P.: The Politics of Transformation: Recruitment-Indoctrination
Processes in a Mass Marathon Psychology
Organization, New York: St. Martin's Press, 1993.
Deikman, A.:
Personal Freedom: On Finding Your Way to the Real World, New York: Bantam,
1976.
Deikman, A.: The
Observing Self: Mysticism and Psychotherapy, Boston: Beacon Press, 1982.
Deikman, A.: The
Wrong Way Home: Uncovering the Patterns of Cult Behavior in American Society,
Boston: Beacon Press, 1990.
Deikman, A.:
Meditations on a Blue Vase (Collected Papers), Napa CA: Fearless Books, 2014.
Deikman, A.: Them
and Us: Cult Thinking and the Terrorist Threat, Berkeley CA: Bay Tree, 2003.
de Mello, A.:
Awareness: The Perils and Opportunities of Reality, New York: Doubleday /
Image, 1990.
Erikson, E.:
Childhood and Society, New York: W. W. Norton, 1950, 1967, 1993.
Erikson, E.: Identity
and the Life Cycle, New York: W. W. Norton, 1959, 1980.
Erikson, E.: The
Problem of Ego Identity, in Stein, M., et al: Identity and Anxiety, Glencoe,
IL: The Free Press, 1960.
Esterson, A.: The
Leaves of Spring: Schizophrenia, Family and Sacrifice, London: Tavistock, 1972.
Esterson, A.;
Cooper, D.; Laing, R.: Results of Family-oriented Therapy with Hospitalized
Schizophrenics, in British Medical Journal, Vol. 2, 1965.
Forward, S.: Emotional Blackmail: When the People in Your Life Use Fear, Obligation, and Guilt to Manipulate You, New York: HarperPerennial, 1998.
Fronsdal, G.: The
Buddha Before Buddhism, Boulder, CO: Shambhala, 2016.
Galanter, M.:
Cults: Faith, Healing and Coercion, New York: Guilford Press, 1989.
Goenka, S. N., in
Hart, W.: The Art of Living: Vipassana Meditation as Taught by S. N. Goenka,
San Francisco: Harper-Collins, 1987.
Goleman, D.: The
Meditative Mind: The Varieties of Meditative Experience, New York: Putnam &
Sons, 1988.
Gottschalk, L.;
Pattison, E.: Psychiatric perspectives on T-groups and the laboratory
movement: an overview, in American Journal of Psychiatry, Vol. 126, No. 6,
December 1969.
Haley, J.: The Art of Being Schizophrenic, in The Power Tactics of Jesus Christ... and Other Essays, New York: Penguin, 1969.
Haley, J.: The
family of the schizophrenic: a model system, in American Journal of Nervous and
Mental Disorders, Vol. 129, 1959.
Hare, R.: Without
Conscience, New York: Guilford Press, 1993.
Harris, S.: Waking
Up: A guide to Spirituality Without Religion, New York: Simon & Schuster,
2014.
Hassan, S.: Combating Cult Mind Control: The #1 Best-Selling Guide to Protection, Rescue and Recovery from Destructive Cults, South Paris ME: Park Street Press, 1989.
Hassan, S.:
Freedom of Mind: Helping Loved Ones Leave Controlling People, Cults &
Beliefs, Newton, MA: Freedom of Mind Press, 2012.
Hayes, S.;
Strosahl, K.; Preston, K.: Acceptance and Commitment Therapy: An Experiential
Approach to Behavior Change, New York: Guilford Press, 1999, 2003.
Hayes, S.;
Follete, V.; Linehan, M.: Mindfulness and Acceptance: Expanding the
Cognitive-Behavioral Tradition, New York: Guilford Press, 2004.
Heller, L.;
LaPierre, A.: Healing Developmental Trauma: How Early Trauma Affects
Self-Regulation, Self-Image, and the Capacity for Relationship (The
NeuroAffective Relational Model for restoring connection), Berkeley, CA: North
Atlantic Books, 2012.
Henry, J.: Culture
Against Man, New York: Random House, 1964.
Henry, J.:
Pathways to Madness, New York: Random House, 1965.
Henry, J.: On
Sham, Vulnerability and other forms of Self-Destruction, London: Allan Lane /
Penguin Press, 1973.
Hoffer, E.: The
True Believer: Thoughts on the Nature of Mass Movements, New York: Harper and
Row, 1951, 1966.
Jackson, D. (ed.):
The Etiology of Schizophrenia: Genetics / Physiology / Psychology / Sociology,
London: Basic Books, 1960.
Jackson, D.: Myths
of Madness: New Facts for Old Fallacies, New York: Macmillan & Co., 1964.
Kabat-Zinn, J.:
Full Catastrophe Living: Using the Wisdom of Your Body and Mind to Face
Stress, Pain and Illness, New York: Dell, 1990.
Kabat-Zinn, J.:
Mindfulness Meditation: Health benefits of an ancient Buddhist practice, in
Goleman, D.; Gurin, J., editors: Mind/Body Medicine, New York: Consumer Reports
Books, 1993.
Kabat-Zinn, J.:
Wherever You Go, There You Are: Mindfulness Meditation in Everyday Life: New
York: Hyperion, 2004.
Kabat-Zinn, J.:
Coming to Our Senses, Healing Ourselves and the World Through Mindfulness, New
York: Hyperion, 2005.
Karpman, S.: Fairy
tales and script drama analysis, in Transactional Analysis Bulletin, Vol. 7,
No. 26, 1968.
Klein, J.: Beyond
Knowledge, Oakland, CA: Non-Duality Press div. of New Harbinger, 1994, 2006.
Kohlberg, L.: The
Psychology of Moral Development: The Nature and Validity of Moral Stages, San
Francisco: Harper & Row, 1984.
Kramer, J.: The
Passionate Mind: A Manual for Living Creatively with One's Self, Berkeley:
North Atlantic Books, 1974.
Kramer, J.;
Alstad, D.: The Guru Papers: Masks of Authoritarian Power, Berkeley, CA: Frog,
Ltd., 1993.
Kramer, J.;
Alstad, D.: The Passionate Mind Revisited: Expanding Personal and Social
Awareness, Berkeley: North Atlantic Books, 2009.
Krishnamurti, J.:
Education and the Significance of Life, San Francisco: HarperSanFrancisco,
(1953) 1975.
Krishnamurti, J.;
Lutyens, M.: The Krishnamurti Reader, New York: Penguin Arcana, (1954, 1963,
1964) 1970.
Krishnamurti, J.;
Huxley, A.: The First & Last Freedom, San Francisco: HarperSanFrancisco,
(1954) 1975.
Krishnamurti, J.:
As One Is: To Free the Mind from All Conditioning, Prescott AZ: Hohm Press,
(1955) 2007.
Krishnamurti, J.;
Lutyens, M.: Freedom from the Known, San Francisco: HarperSanFrancisco, 1969.
Krishnamurti, J.:
The Awakening of Intelligence, San Francisco: HarperSanFrancisco, 1973, 1987.
Krishnamurti, J.:
On God, San Francisco: HarperSanFrancisco, 1992.
Krishnamurti, J.:
On Fear, San Francisco: HarperSanFrancisco, 1992.
Krishnamurti, J.:
On Love and Loneliness, San Francisco: HarperSanFrancisco, 1993.
Krishnamurti, J.:
The Book of Life: Daily Meditations with Krishnamurti, New York: HarperCollins,
1995.
Krishnamurti, J.:
Total Freedom: The Essential Krishnamurti, New York: HarperCollins, 1996.
Krishnamurti, J.:
This Light in Oneself: True Meditation, London: Shambhala, 1999.
Kubler-Ross, E.:
On Death and Dying, New York: Macmillan, 1969.
Kubler-Ross, E.:
Death: The Final Stage of Growth, New York: Scribner, 1997.
Laing, R. D.;
Esterson, A.: Sanity, Madness and the Family, London: Tavistock, 1964.
Langone, M., ed.:
Recovery from Cults: Help for Victims of Psychological and Spiritual Abuse, New
York: W. W. Norton, 1993.
Levine, P.: In An
Unspoken Voice: How the Body Releases Trauma and Restores Goodness, Berkeley,
CA: North Atlantic Books, 2010.
Levine, S.: A
Gradual Awakening, New York: Anchor Books / Doubleday, 1979, 1989.
Levine, S. &
O.: Who Dies? An Investigation of Conscious Living and Conscious Dying, New
York: Doubleday, 1982.
Lidz, R.; Lidz,
T.: The family environment of schizophrenic patients, in American Journal of
Psychiatry, Vol. 106, 1949.
Lidz, T.:
The Origin and Treatment of Schizophrenic Disorders, New York: Basic Books,
1973.
Lidz, T.; Fleck,
S., Cornelison, A.: Schizophrenia and the Family, 2nd Ed.; New York:
International Universities Press, 1985.
Lieberman, M.;
Yalom, I.; Miles, M.: Encounter Groups: First Facts, New York: Basic Books,
1973.
Lifton, R. J.: Methods of Forceful Indoctrination, in Stein, M.; Vidich, A.; White, D. (editors): Identity and Anxiety: Survival of the Person in Mass Society, Glencoe, IL: The Free Press of Glencoe, Illinois, 1960.
Lifton, R. J.:
Boundaries: Psychological Man in Revolution, New York: Vintage, 1970.
Linehan, M.:
Cognitive–Behavioral Treatment of Borderline Personality Disorder, New York:
Guilford Press, 1993.
Maharshi, S. R.,
in Godman, D.: Be As You Are: The Teachings of Sri Ramana Maharshi, New York:
Penguin Press, (1982) 1991.
Marcia, J.:
Development and validation of ego identity status, in Journal of Personality
and Social Psychology, Vol. 3, 1966.
Martin, J.: The Kingdom of the Cults, Minneapolis: Bethany House, 1985.
McEwen, B.;
Seeman, T.: Protective and damaging effects of mediators of stress: Elaborating
and testing the concepts of allostasis and allostatic load, in Annals of the
New York Academy of Sciences, Vol. 896, 1999.
McEwen, B: Mood
Disorders and Allostatic Load, in Journal of Biological Psychiatry, Vol. 54,
2003.
McEwen, B.;
Lasley, E. N.: The End of Stress as We Know It, Washington, DC: The Dana Press,
2003.
Meerloo, J.:
Brainwashing and Menticide, in Stein, M.; Vidich, A.; White, D. (editors):
Identity and Anxiety: Survival of the Person in Mass Society, Glencoe, IL: The
Free Press of Glencoe, Illinois, 1960.
Mellody, P.;
Miller, A. W.: Facing Codependence: What It Is, Where It Comes From, How It
Sabotages Our Lives, San Francisco: Harper, 1989.
Millon, T.;
Simonsen, E.; Birket-Smith, M.; Davis, R.: Psychopathy: Antisocial, Criminal,
and Violent Behavior, London: Guilford Press, 1998.
Mithers, C. L.: Therapy Gone Mad: The True Story of Hundreds of Patients and a Generation Betrayed, Menlo Park CA: Addison-Wesley Publishing, 1994.
Ogden, P.; Minton,
K.: Trauma and the Body: A Sensorimotor Approach to Psychotherapy, New York: W.
W. Norton, 2006.
Ogden, P.; Fisher,
J.: Sensorimotor Psychotherapy: Interventions for Trauma and Attachment, New
York: W. W. Norton, 2015.
Sapolsky, R.: Why
Zebras Don't Get Ulcers: The Acclaimed Guide to Stress, Stress-Related Diseases
and Coping, 3rd Ed., New York: Holt, 2004.
Sargant, W.: Battle for the Mind: A Physiology of Conversion and Brain-Washing, orig. pub. 1957, Cambridge, MA: Major Books, 1997.
Schaef, A. W.:
Co-dependence: Misunderstood, Mistreated, New York: HarperOne, 1992.
Segal, Z.;
Williams, J. M.; Teasdale, J.: Mindfulness-Based Cognitive Therapy for
Depression, London: Guilford Press, 2001.
Seligman, M.:
Learned Optimism: How to Change Your Mind and Your Life, New York: Knopf, 1990.
Selye, H.: Stress
Without Distress, Philadelphia: J. B. Lippincott, 1974.
Shaver, P.;
Mikulincer, M.: Psychodynamics of Adult Attachment: A Research Perspective, in
Journal of Attachment and Human Development, Vol. 4, 2002.
Siegel, D.:
Reflections on the Mindful Brain, in Mind Your Brain, Los Angeles: Lifespan
Learning Institute, 2007.
Siegel, D.: The
Mindful Therapist: A Clinician’s Guide to Mindsight and Neural Integration, New
York: W. W. Norton & Company, 2010.
Siegel, D.:
Mindsight: The New Science of Personal Transformation, New York: Bantam, 2010.
Singer, M. T.:
Cults in Our Midst, San Francisco: Jossey-Bass, 1995.
Skinner, B. F.:
Beyond Freedom and Dignity, New York: Alfred A. Knopf, 1971.
Skinner, B. F.:
About Behaviorism, New York: Random House, 1974.
Stein, A.: Terror, Love and Brainwashing: Attachment in Cults and Totalitarian Systems, London: Routledge, 2017.
Tart, C. (ed.):
Transpersonal Psychologies: Perspectives on the Mind from Seven Great Spiritual
Traditions, San Francisco: Harper-Collins, 1975, 1992.
Tart, C.: Waking
Up: Overcoming the Obstacles to Human Potential, New York: New Science Library,
1987.
Tart, C.: Living
the Mindful Life: a handbook for living in the present moment, Boston:
Shambhala, 1994.
Tart, C.: Mind
Science: Meditation Training for Practical People, Napa, CA: Fearless Books,
2013.
Taylor, K.:
Brainwashing: The Science of Thought Control, London: Oxford University Press,
2004.
Tobias, M.; Lalich, J.: Captive Hearts, Captive Minds: Freedom and Recovery from Cults and Abusive Relationships, Alameda, CA: Hunter House, 1996.
Tolle, E.: The
Power of Now: A Guide to Spiritual Enlightenment, Novato, CA: New World
Library, 1999.
Trungpa, C.: The
Myth of Freedom and the Way of Meditation, Boston: Shambhala, 1976, 2001.
Trungpa, C.:
Cutting Through Spiritual Materialism, Boston: Shambhala, 1973, 2002.
Trungpa, C.: The
Heart of the Buddha, Boston: Shambhala, 1991.
Vaillant, G.: Ego
Mechanisms of Defense: A Guide for Clinicians and Researchers, 1st Ed.,
Arlington, VA: American Psychiatric Publishing, 1992.
Van der Kolk, B.:
The Compulsion to Repeat the Trauma: Re-enactment, Re-victimization, and
Masochism, in Psychiatric Clinics of North America, Vol. 12, No. 2, 1989.
Van der Kolk, B.;
Hopper, J.; Osterman, J.: Exploring the Nature of Traumatic Memory:
Combining Clinical Knowledge with Laboratory Methods; in Journal of Aggression,
Maltreatment & Trauma, Vol. 4, No. 2, 2001.
Van der Kolk, B:
Traumatic Stress: The Effects of Overwhelming Experience on Mind, Body and
Society, New York: Guilford Press, 1996 / 2007.
Van der Kolk, B:
The Body Keeps the Score: Brain, Mind, and Body in the Healing of Trauma, New
York: Viking Press, 2014.
Watson, J.:
Behaviorism, Revised Edition, Chicago: University of Chicago Press, 1930.
Watts, A.: The
Wisdom of Insecurity: A Message for the Age of Anxiety, New York: Random House,
1951.
Watts, A.: The Way
of Zen, New York: Random House / Pantheon, 1957.
Watts, A.: Nature,
Man and Woman, New York: Random House, 1958.
Watts, A.:
Psychotherapy East and West, New York: Random House / Pantheon, 1961.
Watts, A.; Al
Chung-liang Huang: Tao: The Watercourse Way, New York: Pantheon,
1975.
Watzlawick, P.;
Beavin, J.; et al: Protection and scapegoating in pathological families, in
Family Process, Vol. 9, 1970.
Weiner, B.: An
attributional theory of motivation and emotion. New York: Springer-Verlag,
1986.
Weinhold, B.;
Weinhold, J.: Breaking Free of the Co-dependency Trap, Revised Edition, Novato,
CA: New World Library, 2008.
Williams, M.;
Penman, D.: Mindfulness: An Eight-Week Plan for Finding Peace in a Frantic
World, New York: Rodale, 2011.
Williams, M.;
Teasdale, J.; Segal, Z.; Kabat-Zinn, J.: The Mindful Way through Depression,
New York: Guilford Press, 2007.
Williams, M.;
Poijula, S.: The PTSD Workbook, Second Edition; Oakland, CA: New Harbinger,
2013.
Wilson, B.:
Alcoholics Anonymous, New York, A. A. World Services, 1939, 1955, 1976, 2001.
Wilson, B.: Twelve
Steps and Twelve Traditions, New York: A. A. World Services, 1951.
Wolpe, J.:
Psychotherapy by Reciprocal Inhibition, Palo Alto, CA: Stanford University
Press, 1958.
Wright, L.: Going
Clear: Scientology, Hollywood, & the Prison of Belief, New York: Alfred A.
Knopf, 2013.
Wright, R.: Why
Buddhism is True: The Science and Philosophy of Meditation and Enlightenment,
New York: Simon & Schuster, 2017.
Zimbardo, P.: The Lucifer Effect: Understanding How Good People Turn Evil, New
York: Random House, 2007.