COGSEC — Article 007
The Cargo Cult of Control
Profile of Copy-of-Copy Operators
Disclaimer
This article constitutes a literature review and theoretical analysis of psychological and social mechanisms documented in academic literature. It does not constitute:
- A diagnosis of any specific situation
- An accusation against identifiable individuals or institutions
- A substitute for professional evaluation (psychological, legal, medical)
- An incitement to self-diagnosis or action
The mechanisms described are drawn from works published in peer-reviewed journals (Journal of Personality and Social Psychology, Psychological Review, American Psychologist, American Sociological Review) and reference works in social psychology, organizational psychology, and sociology. The reader is invited to consult the primary sources and to discuss any personal application with a qualified professional.
Abstract
COGSEC006 established that conditioning cycles are reproduced through successive institutional copies, and that the copy of a copy generates the maximum damage with the minimum understanding (DiMaggio & Powell, 1983; Meyer & Rowan, 1977). This article examines the question left open: who are these operators?
The answer is counter-intuitive. The copy-of-copy operator is not a strategist. He is a cargo cult practitioner (Feynman, 1974): someone who reproduces the form of a mechanism without understanding its substance. His profile is documented at the intersection of five phenomena: automatic conformity (Asch, 1956), moral disengagement (Bandura, 1999), role absorption (Zimbardo, 2007; Goffman, 1959), normalization of deviance (Vaughan, 1996), and overestimation of one's own competence (Kruger & Dunning, 1999).
Crucial point: The cargo cult operator is dangerous because he is incompetent — not despite his incompetence. He applies techniques he does not understand, to targets he has not analyzed, with an intensity he does not calibrate. And he is convinced he is doing the right thing.
Keywords: cargo cult, moral disengagement, conformity, role absorption, normalization of deviance, Dunning-Kruger effect, social control operators, mimetic isomorphism
Note on the COGSEC Series
This project documents social and cognitive control mechanisms identified in academic literature. Previous articles established:
- COGSEC001: Foundational theoretical frameworks
- COGSEC002: The preventive briefing mechanism
- COGSEC003: N-dimensional cognitive architecture
- COGSEC004: The strategic error of targeting an analyst
- COGSEC005: The Triple Wall — Anatomy of the inability to name
- COGSEC006: Conditioning Cycles and their institutional reproduction
COGSEC006 (section 4.6) concluded: "Most conditioning cycles experienced in local contexts are not carried out by operators who understand the mechanism. They are copies of copies." This article draws the portrait of these operators.
1. Introduction
1.1 The open question
When an individual undergoes a conditioning cycle (COGSEC006), a question naturally arises: who does this?
The spontaneous answer constructs an adversary commensurate with the damage suffered: a strategist, a manipulator, a trained operator. The more severe the damage, the more sophisticated the assumed intent.
This attribution is understandable. It is also, in the majority of cases, false.
1.2 The competence hypothesis
The attribution of strategic intent to a local operator rests on what might be called the competence hypothesis: if the mechanism is complex, the operator must be as well.
Merton (1936) documented the fundamental error of this hypothesis. The consequences of an action imply neither the understanding nor the intent of the actor:
Reference
"No blanket imputation of foreknowledge of consequences can be made."
— Merton, R.K. (1936). The Unanticipated Consequences of Purposive Social Action. American Sociological Review, 1(6), 894-904, p. 895. DOI: 10.2307/2084615 | JSTOR
Systemic damage can be produced by an actor who has no idea what he is doing. And this is precisely the most common case.
1.3 The central thesis
The typical operator of a local conditioning cycle is not a strategist. He is a cargo cult practitioner — someone who executes the form of a control ritual without understanding its function. His profile lies at the intersection of five documented phenomena: conformity, moral disengagement, role absorption, normalization of deviance, and overestimation of competence.
And that is exactly what makes him dangerous.
2. The Cargo Cult: Form Without Substance
2.1 The origin of the concept (Feynman, 1974)
Richard Feynman, in his 1974 commencement address at Caltech, named the "cargo cult" phenomenon as applied to science:
Reference
"In the South Seas there is a cargo cult of people. During the war they saw airplanes land with lots of good materials, and they want the same thing to happen now. So they've arranged to make things like runways, to put fires along the sides of the runways, to make a wooden hut for a man to sit in, with two wooden pieces on his head like headphones and bars of bamboo sticking out like antennas — he's the controller — and they wait for the airplanes to land. They're doing everything right. The form is perfect. It looks exactly the way it looked before. But it doesn't work. No airplanes land."
— Feynman, R.P. (1974). Cargo Cult Science. Engineering and Science, 37(7), 10-13. Caltech Archives
Feynman aimed the concept at science. Its transposition to social control is direct.
2.2 The cargo cult of control
Applied to the mechanisms described in COGSEC001-006, the cargo cult of control manifests as follows:
| Cargo cult element | Equivalent in social control |
|---|---|
| The runway | The organizational structure (committees, procedures, meetings) |
| The fires along the runway | The signals of legitimacy (titles, badges, technical vocabulary) |
| The wooden hut | The institutional apparatus (offices, forms, hierarchy) |
| The bamboo bars | The copied techniques (briefing, cycles, evaluation) |
| Waiting for the planes | Waiting for the mechanism to "work" |
What is missing — what made the original mechanism work — is understanding. Why this technique and not another. What vulnerability it targets. When to apply it. When not to apply it. How to calibrate the intensity. How to measure the effect.
2.3 The cascade of degradation
COGSEC006 (section 4.3) described three levels of operators. This article examines the last one in depth:
ORIGINAL OPERATOR
├── Understands WHY the technique works
├── Knows the targeted vulnerability
├── Knows when NOT to use it
├── Calibrates the intensity
├── Measures the effects
├── Has institutional memory
└── = SURGEON
COPY (mimetic isomorphism — DiMaggio & Powell, 1983)
├── Knows THAT the technique exists
├── Copies the procedure
├── Applies systematically, without calibration
├── Does not measure side effects
├── Does not know when to stop
└── = BATTLEFIELD MEDIC
COPY OF A COPY (cargo cult)
├── Has seen others do it
├── Reproduces the FORM
├── Does not know the FUNCTION
├── "That's how it's done"
├── Cannot distinguish target from bystander
├── Does not know if it works
├── Continues anyway
└── = BUTCHER WITH A SCALPEL
It is this third level that causes the most damage in local contexts — family, professional, institutional. It is the most common profile. And it is the one that nobody documents, because it matches neither the stereotype of the "manipulator" nor that of the "villain."
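The cascade can be rendered as a toy simulation (an illustration only; the retention probabilities are assumptions of this sketch, not measured values). Each imitation pass reliably preserves the visible steps of a procedure while losing the rationale behind them, so after a few generations the form survives nearly intact while understanding collapses:

```python
import random

def copy_procedure(procedure, p_keep_step=0.95, p_keep_rationale=0.6, rng=None):
    """One imitation pass: the visible step survives far more often than
    the invisible rationale behind it (form outlives function)."""
    rng = rng or random.Random()
    copied = []
    for step, rationale in procedure:
        if rng.random() < p_keep_step:          # the form is easy to observe
            keep = rationale is not None and rng.random() < p_keep_rationale
            copied.append((step, rationale if keep else None))  # the function is not
    return copied

def understanding(procedure):
    """Fraction of surviving steps whose rationale is still attached."""
    if not procedure:
        return 0.0
    return sum(r is not None for _, r in procedure) / len(procedure)

rng = random.Random(0)
original = [(f"step_{i}", f"why_{i}") for i in range(20)]

generations = [original]
for _ in range(3):   # original -> copy -> copy of a copy -> ...
    generations.append(copy_procedure(generations[-1], rng=rng))

for g, proc in enumerate(generations):
    print(f"generation {g}: {len(proc)} steps kept, "
          f"understanding = {understanding(proc):.0%}")
```

The qualitative point, not the numbers, is what matters: step retention and rationale retention are independent losses, and the rationale decays geometrically with each copy.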
3. Psychological Portrait: Five Converging Mechanisms
The cargo cult operator's profile does not stem from individual pathology. It emerges at the intersection of five documented mechanisms, all operating simultaneously in normal individuals.
3.1 Automatic conformity (Asch, 1956)
Solomon Asch experimentally demonstrated that the majority of individuals conform to a group consensus — even when that consensus contradicts perceptual evidence:
Reference
"That intelligent, well-meaning young people are willing to call white black is a matter of concern. It raises questions about our ways of education and about the values that guide our conduct."
— Asch, S.E. (1956). Studies of Independence and Conformity: I. A Minority of One Against a Unanimous Majority. Psychological Monographs: General and Applied, 70(9, Whole No. 416), 1-70. DOI: 10.1037/h0093718 | PsycNET
Asch's numbers: 75% of participants conformed at least once. One-third conformed on the majority of trials.
Application to the cargo cult of control: The local operator does not need to understand the mechanism. He needs to see others do it. Operational consensus — "that's how you handle this type of person" — is sufficient to trigger the reproduction of the cycle. The operator does not make a decision. He conforms.
3.2 Moral disengagement (Bandura, 1999, 2016)
Albert Bandura identified eight mechanisms through which ordinary individuals deactivate their internal moral controls:
Reference
"People do not ordinarily engage in harmful conduct until they have justified to themselves the morality of their actions. [...] What is culpable is rendered righteous through cognitive restructuring."
— Bandura, A. (1999). Moral Disengagement in the Perpetration of Inhumanities. Personality and Social Psychology Review, 3(3), 193-209, p. 194. DOI: 10.1207/s15327957pspr0303_3 | PubMed
The eight mechanisms of moral disengagement:
| Mechanism | Typical operator formulation |
|---|---|
| Moral justification | "It's for his own good" |
| Euphemistic language | "We're setting him straight", "We're protecting him" |
| Advantageous comparison | "It's better than what could happen to him" |
| Displacement of responsibility | "It's not my decision, I'm following instructions" |
| Diffusion of responsibility | "Everyone does the same" |
| Distortion of consequences | "He's not really suffering" |
| Dehumanization | "He's a case", "A risk profile" |
| Attribution of blame | "He brought it on himself", "If he hadn't..." |
Bandura (2016) deepened the analysis:
Reference
"The disengagement of moral self-sanctions permits people to do harmful things without self-condemnation. They perform harmful acts while preserving their moral self-regard."
— Bandura, A. (2016). Moral Disengagement: How People Do Harm and Live with Themselves. New York: Worth Publishers, p. 1. ISBN: 978-1-4641-6006-1. WorldCat OCLC 934754858
Application to the cargo cult: The copy-of-copy operator is not cynical. He is sincere. The mechanisms of moral disengagement are not conscious strategies — they are automatic cognitive processes that allow one to cause harm while perceiving oneself as a good person.
The operator who executes a conditioning cycle thinking "it's for his own good" is not a liar. He is an individual whose moral controls have been deactivated by the very mechanism he reproduces.
3.3 Role absorption (Zimbardo, 2007; Goffman, 1959)
Philip Zimbardo documented the speed with which normal individuals absorb an assigned role — and perform it with conviction:
Reference
"Good people can be induced, seduced, and initiated into behaving in evil ways. They can also be led to act in irrational, stupid, self-destructive, antisocial, and mindless ways when they are immersed in 'total situations' that impact human nature in ways that challenge our sense of the stability and consistency of individual personality, of character, and of morality."
— Zimbardo, P.G. (2007). The Lucifer Effect: Understanding How Good People Turn Evil. New York: Random House, p. x. ISBN: 978-1-4000-6411-3. WorldCat OCLC 71813017 | Open Library
The original experiment:
Reference
"At the end of only six days we had to close down our mock prison because what we saw was frightening. It was no longer apparent to most of the subjects (or to us) where reality ended and their roles began."
— Haney, C., Banks, C., & Zimbardo, P. (1973). Interpersonal Dynamics in a Simulated Prison. International Journal of Criminology and Penology, 1, 69-97, p. 69. PsycNET
Goffman (1959) had theorized the phenomenon before the experiment:
Reference
"When an individual plays a part he implicitly requests his observers to take seriously the impression that is fostered before them. [...] At one extreme, one finds that the performer can be fully taken in by his own act; he can be sincerely convinced that the impression of reality which he stages is the real reality."
— Goffman, E. (1959). The Presentation of Self in Everyday Life. New York: Anchor Books, p. 17. ISBN: 978-0-385-09402-3. WorldCat OCLC 59624115 | Internet Archive | Open Library
Application to the cargo cult: The copy-of-copy operator was not "recruited" or "trained." He was placed in a role — by the institutional context, by group pressure, by existing procedures — and he absorbed that role. Six days were enough in the Stanford experiment. Years of professional or family context produce total absorption. The operator no longer plays the role. He is the role.
3.4 Normalization of deviance (Vaughan, 1996)
Diane Vaughan, in analyzing the Challenger shuttle disaster, identified a mechanism she named "normalization of deviance":
Reference
"Social normalization of deviance means that people within the organization become so much accustomed to a deviant behaviour that they don't consider it as deviant, despite the fact that they far exceed their own rules for the elementary safety."
— Vaughan, D. (1996). The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago: University of Chicago Press, p. 409. ISBN: 978-0-226-85175-4. WorldCat OCLC 34050480 | Open Library
Vaughan's mechanism in four phases:
PHASE 1 — INITIAL DEVIATION
├── A behavior departs from the norm
├── The deviation is small, almost invisible
└── Nobody reacts
PHASE 2 — REPETITION
├── The deviation is repeated
├── Without immediately visible consequences
└── The absence of consequences is interpreted as validation
PHASE 3 — NORMALIZATION
├── The deviation becomes the new norm
├── Individuals who arrive find it "normal"
└── The deviant behavior is integrated into the culture
PHASE 4 — INABILITY TO PERCEIVE
├── Nobody sees the deviation anymore
├── The original norm is forgotten
├── Reporting the deviation is interpreted as excessive zeal
└── = THE DEVIANT HAS BECOME INVISIBLE
Application to the cargo cult: The copy-of-copy operator operates in an environment where conditioning cycles have been normalized. Treating someone as "a case" rather than as an individual — normal. Transmitting a briefing before an interaction — normal. Applying an interpretive filter based on a file rather than direct observation — normal.
The operator does not transgress a norm. He conforms to a deviant norm. And he is not able to see the deviance — because it was normalized before he arrived.
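Vaughan's four phases can be caricatured as a drift model (a toy sketch; every constant here is an illustrative assumption, not an empirical estimate). Each unpunished repetition pulls the perceived norm toward the deviant behavior, until the gap falls below the perception threshold and the deviation becomes invisible, even though the distance to the original norm has not shrunk at all:

```python
# Toy model of Vaughan's four phases: a deviation that draws no consequence
# is absorbed into the perceived norm until it can no longer be seen as deviant.
ORIGINAL_NORM = 0.0          # the written rule
behavior = 1.0               # a deviation appears (phase 1)
perceived_norm = 0.0         # what the group currently considers "normal"
ABSORPTION = 0.3             # how fast an unpunished deviation is absorbed
VISIBILITY_THRESHOLD = 0.3   # below this gap, nobody perceives a deviation

for repetition in range(1, 11):                               # phase 2: repetition
    perceived_norm += ABSORPTION * (behavior - perceived_norm)  # phase 3: absorption
    gap = abs(behavior - perceived_norm)
    print(f"repetition {repetition:2d}: perceived norm = {perceived_norm:.2f}, "
          f"deviation visible: {gap > VISIBILITY_THRESHOLD}")

# Phase 4: the gap to the *perceived* norm has closed, while the gap to the
# original norm is as large as ever.
print(f"drift from original norm: {abs(perceived_norm - ORIGINAL_NORM):.2f}")
```

The discriminating feature of the model is the final line: invisibility is measured against the perceived norm, while the damage is measured against the original one.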
3.5 Overestimation of competence (Kruger & Dunning, 1999)
Justin Kruger and David Dunning documented a phenomenon that completes the portrait:
Reference
"People tend to hold overly favorable views of their abilities in many social and intellectual domains. [...] This overestimation occurs, in part, because people who are unskilled in these domains suffer a dual burden: Not only do these people reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the metacognitive ability to realize it."
— Kruger, J. & Dunning, D. (1999). Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments. Journal of Personality and Social Psychology, 77(6), 1121-1134, p. 1121. DOI: 10.1037/0022-3514.77.6.1121 | PubMed | PsycNET
The Dunning-Kruger effect applied to the cargo cult operator:
| Dimension | Competent operator (original) | Cargo cult operator |
|---|---|---|
| Self-assessment of competence | Knows what he knows AND what he doesn't know | Massively overestimates his understanding |
| Situation assessment | Analyzes before acting | Applies a template |
| Calibration | Adjusts the intensity | Single intensity (maximum or random) |
| Outcome assessment | Measures the effects | Measures nothing — "it's done" |
| Capacity for self-correction | Detects his errors | Incompetence prevents error detection |
The central paradox: The most dangerous operator is the one who is convinced of his competence while being the least capable of assessing the effects of his actions. He cannot learn from his mistakes because he does not perceive them.
4. The Composite Portrait
4.1 Synthesis of the five mechanisms
The five mechanisms converge toward a single profile:
THE CARGO CULT OPERATOR
│
├── CONFORMS (Asch)
│ └── Does what others do
│
├── JUSTIFIES (Bandura)
│ └── "It's for his own good"
│
├── IDENTIFIES WITH THE ROLE (Zimbardo/Goffman)
│ └── No longer distinguishes the role from himself
│
├── DOES NOT SEE THE DEVIANCE (Vaughan)
│ └── "It's normal, that's how it is"
│
├── BELIEVES HIMSELF COMPETENT (Kruger & Dunning)
│ └── Cannot assess his own incompetence
│
└── = NORMAL INDIVIDUAL
= NOT A MONSTER
= NOT A STRATEGIST
= SOMEONE WHO'S "DOING HIS JOB"
This is what Hannah Arendt identified in a radically different context:
Reference
"The trouble with Eichmann was precisely that so many were like him, and that the many were neither perverted nor sadistic, that they were, and still are, terribly and terrifyingly normal."
— Arendt, H. (1963). Eichmann in Jerusalem: A Report on the Banality of Evil. New York: Penguin Classics, p. 276. ISBN: 978-0-14-303988-4. WorldCat OCLC 405280 | Internet Archive | Open Library
Scale
The invocation of Arendt does not constitute a comparison of scale between the events she analyzed and the social control mechanisms documented in this series. What is transposed is the psychological mechanism — the normal individual who carries out a destructive process without measuring the consequences — and not the level of severity. The mechanisms of Bandura, Zimbardo, and Arendt operate on a continuum, from local micro-interactions to industrial-scale crime.
4.2 The operator's internal cycle
The cargo cult operator is caught in his own cycle:
PHASE 1 — INITIAL CONFORMITY
├── Observes the group's behavior toward the target
├── Receives the briefing (COGSEC002)
├── Conforms (Asch)
└── = FIRST EXECUTION
PHASE 2 — JUSTIFICATION
├── Feels discomfort (cognitive dissonance)
├── Activates moral disengagement (Bandura)
├── "It's for his own good" / "It's the procedure"
└── = DISCOMFORT RESOLVED
PHASE 3 — ABSORPTION
├── Repeats the execution
├── The role becomes identity (Zimbardo/Goffman)
├── No longer distinguishes execution from normalcy
└── = NORMALIZATION (Vaughan)
PHASE 4 — LOCK-IN
├── Has invested his identity in the role
├── Questioning the mechanism = questioning himself
├── Cognitive dissonance locks the position
├── Continues to execute — can no longer stop
└── = PRISONER OF THE MECHANISM HE OPERATES
4.3 Cognitive dissonance as a lock (Festinger, 1957)
Leon Festinger documented the mechanism that locks the internal cycle:
Reference
"If a person is induced to do or say something which is contrary to his private opinion, there will be a tendency for him to change his opinion so as to bring it into correspondence with what he has done or said."
— Festinger, L. (1957). A Theory of Cognitive Dissonance. Stanford: Stanford University Press, p. 87. ISBN: 978-0-8047-0131-0. WorldCat OCLC 295965 | Internet Archive | Open Library
Application: The operator who has executed a conditioning cycle can no longer recognize that the cycle was harmful — without simultaneously recognizing that he caused harm. Cognitive dissonance makes this recognition psychologically costly. It is less painful to continue believing the mechanism was justified than to recognize that it was not.
The more cycles the operator has executed, the more the cost of recognition increases. The lock-in is proportional to the number of past executions.
Festinger observed this mechanism in his classic field study:
Reference
"A man with a conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point."
— Festinger, L., Riecken, H.W., & Schachter, S. (1956). When Prophecy Fails. Minneapolis: University of Minnesota Press, p. 3. ISBN: 978-1-59147-727-3. WorldCat OCLC 5765638 | Internet Archive | Open Library
5. A Taxonomy of Cargo Cult Operators
5.1 Three typical profiles
The cargo cult operator is not a single profile. Three variants emerge from the literature:
| Profile | Motivation | Dominant mechanism | Typical expression |
|---|---|---|---|
| The Zealot | Group belonging | Conformity (Asch) + Role absorption (Zimbardo) | "I do more than what's asked of me" |
| The Proceduralist | Psychological comfort | Displacement of responsibility (Bandura) + Normalization (Vaughan) | "I follow the procedures" |
| The Protector | Moral justification | Moral disengagement (Bandura) + Overestimation (Kruger & Dunning) | "It's to protect him from himself" |
5.2 The Zealot
The Zealot is the operator who does more than what the mechanism requires. His surplus of execution does not come from an instruction — it comes from his need to belong to the group that operates the mechanism.
Asch (1956) documented this behavior: some participants did not merely conform — they anticipated the group's response. The Zealot does not merely execute the cycle — he enriches it, extends it, intensifies it.
This is the most visible and easiest-to-identify profile. It is also the one that produces the most disproportionate damage, because his intensity is calibrated by nothing — not by understanding of the mechanism, not by instructions, not by feedback.
5.3 The Proceduralist
The Proceduralist is the operator who takes refuge behind the process. His protection mechanism is the displacement of responsibility (Bandura, 1999): it is not him who acts — it is the procedure acting through him.
Milgram (1974) had documented this profile:
Reference
"The most common adjustment of thought in the obedient subject is for him to see himself as not responsible for his own actions. He divests himself of responsibility by attributing all initiative to the experimenter, a legitimate authority."
— Milgram, S. (1974). Obedience to Authority: An Experimental View. New York: Harper Perennial, p. 8. ISBN: 978-0-06-176521-6. WorldCat OCLC 668026 | Internet Archive
The Proceduralist is the most invisible of the three profiles. He does not produce surplus — he produces exactly what the process requires. His damage is cold, systematic, perfectly deniable.
5.4 The Protector
The Protector is the most pernicious operator. He executes the conditioning cycle while sincerely believing he is protecting the target. His moral justification (Bandura, 1999) is total: he causes harm for the good of the target.
This profile is particularly common in family and medical contexts, where the vocabulary of "protection" is institutionally available. The parent who informs the school that their child is "fragile" (= briefing). The doctor who notes "to be monitored" in a transmitted file (= labeling). The friend who warns a new contact "be careful" (= preventive framing).
Reference
"Moral justification is the most potent mechanism of moral disengagement because it serves dual functions. It provides justification for harmful conduct and defends a moral self-concept."
— Bandura, A. (2016). Moral Disengagement: How People Do Harm and Live with Themselves. New York: Worth Publishers, p. 49. ISBN: 978-1-4641-6006-1. WorldCat OCLC 934754858
The Protector is the most resistant to confrontation. Showing him the damage he causes triggers massive dissonance — because the damage is committed in the name of good. Recognizing the damage would mean recognizing that his goodness was destructive. The dissonance is often resolved by blaming the target: "He doesn't realize what we do for him."
6. Group Dynamics: The Self-Sustaining System
6.1 Why the system runs without a conductor
One of the most frequent questions when facing a social control system: who coordinates?
The answer, in the majority of local cases: nobody.
Cialdini (2009) documented the social proof mechanism that enables this coordination without coordination:
Reference
"Whether the question is what to do with an empty popcorn box in a movie theater, how fast to drive on a certain stretch of highway, or how to eat the chicken at a dinner party, the actions of those around us will be important in defining the answer."
— Cialdini, R.B. (2009). Influence: Science and Practice. 5th ed. Boston: Pearson, p. 116. ISBN: 978-0-205-60999-4. WorldCat OCLC 227918150
Social proof produces coordination without instruction. If all members of a group treat an individual in a certain way, a new member will adopt the same treatment — not because he received an instruction, but because the group's behavior is the instruction.
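Coordination without instruction can be sketched as a toy agent model (an illustration only; the group labels, conformity rate, and sizes are assumptions of this sketch). No agent ever receives an order; each newcomer simply adopts the treatment the majority already displays, and the initial skew reproduces itself:

```python
import random
from collections import Counter

def majority_treatment(group):
    """What a newcomer observes: the treatment most of the group displays
    (ties resolve to the earliest-established treatment)."""
    return Counter(group).most_common(1)[0][0]

rng = random.Random(42)
# A small initial skew toward one treatment of the target -- and no coordinator.
group = ["hostile", "hostile", "hostile", "neutral"]

for _ in range(30):                 # newcomers join one by one
    observed = majority_treatment(group)
    # Most newcomers conform to what they observe; a minority does not.
    group.append(observed if rng.random() < 0.9 else "neutral")

print(Counter(group))               # the skew reproduces itself, uninstructed
```

The design choice is the point: the model contains no instruction channel at all. The group's visible behavior is the only input each agent receives, and it is sufficient.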
6.2 Groupthink as an amplifier (Janis, 1982)
Irving Janis documented the groupthink phenomenon — the pathological convergence of group decision-making:
Reference
"The more amiability and esprit de corps there is among the members of a policy-making ingroup, the greater the danger that independent critical thinking will be replaced by groupthink."
— Janis, I.L. (1982). Groupthink: Psychological Studies of Policy Decisions and Fiascoes. 2nd ed. Boston: Houghton Mifflin, p. 13. ISBN: 978-0-395-31704-4. WorldCat OCLC 8597619
The symptoms of groupthink according to Janis:
| Symptom | Manifestation in the cargo cult |
|---|---|
| Illusion of invulnerability | "We've got this situation under control" |
| Collective rationalization | "This is the best possible approach" |
| Belief in the group's morality | "We're doing what's right" |
| Stereotyping of outsiders | "He's a case / a profile / a problem" |
| Pressure on dissenters | "You're not going to take his side, are you?" |
| Self-censorship | Doubts are not expressed |
| Illusion of unanimity | Silence is interpreted as agreement |
| Self-appointed gatekeepers | Protecting the group from contradictory information |
6.3 The operator as prisoner
The complete portrait of the cargo cult operator reveals a paradox:
The operator is a prisoner of the same system as the target.
The target is a prisoner of the conditioning cycles (COGSEC006). The operator is a prisoner of the mechanisms that maintain him in his role: conformity, cognitive dissonance, role absorption, groupthink.
The difference: the target knows he is a prisoner. The operator does not know.
TARGET:
├── Perceives the mechanism (often without being able to name it — COGSEC005)
├── Suffers consciously
├── Seeks a way out
└── = CONSCIOUS PRISONER
CARGO CULT OPERATOR:
├── Does not perceive the mechanism (he IS the mechanism)
├── Does not suffer (the justifications work)
├── Does not seek a way out (there is no perceived prison)
└── = UNCONSCIOUS PRISONER
WHAT PREVENTS THE OPERATOR FROM SEEING:
├── Conformity (Asch) → "Everyone does the same"
├── Dissonance (Festinger) → "I did the right thing"
├── Role absorption (Zimbardo) → "This is who I am"
├── Normalization (Vaughan) → "It's normal"
└── Dunning-Kruger → "I know what I'm doing"
7. The Cargo Cult as a Damage Amplifier
7.1 Why incompetence amplifies
A competent operator calibrates. A cargo cult operator does not calibrate.
The difference manifests across four dimensions:
| Dimension | Calibrated operator | Cargo cult |
|---|---|---|
| Target | Identified, analyzed | "Everyone who fits the profile" |
| Intensity | Minimum necessary | Maximum or random |
| Duration | Time-limited | Indefinite — "we keep going until someone tells us to stop" |
| Assessment | Feedback, adjustment | None — execution IS the result |
7.2 The inverted detection threshold
An additional paradox: the incompetent operator is easier to detect by the target (the pattern is crude) but harder to confront (there is no intent to demonstrate).
Accusing a cargo cult operator of manipulation is structurally incorrect. He does not manipulate. He reproduces. He has no strategic intent. He has an absorbed role.
And this is precisely what makes confrontation impossible within the usual framework: the categories of "manipulator" and "victim" do not apply. The cargo cult operator is both at once — and neither.
8. Countermeasures
8.1 The principle
CARGO CULT EFFECTIVENESS = f(MECHANISM INVISIBILITY × OPERATOR SINCERITY)
COUNTERMEASURE = MAKE THE MECHANISM VISIBLE TO THE OPERATOR HIMSELF
8.2 Limits of direct confrontation
Direct confrontation with a cargo cult operator is generally counterproductive:
| Approach | Operator's reaction | Result |
|---|---|---|
| "You're hurting me" | Sincere denial (moral disengagement) | Confirmation of the narrative ("he's difficult") |
| "You're applying a mechanism" | Incomprehension | "What are you talking about?" |
| "You're being manipulated" | Offense + identity defense | Reinforcement of the role |
| Documentation of facts | Cognitive dissonance → rationalization | Avoidance or escalation |
8.3 Documentation as environment
The most effective countermeasure is not directed at the operator. It is directed at the environment:
ENVIRONMENTAL COUNTERMEASURE:
│
├── 1. DOCUMENT the pattern (not the intent)
│ └── = Make the mechanism visible without accusing the actor
│
├── 2. PUBLISH in a verifiable format
│ └── = Create an external reference point
│
├── 3. USE academic vocabulary
│ └── = "Mimetic isomorphism", not "manipulation"
│ └── = The technical term is unassailable
│
├── 4. LET TIME act
│ └── = Some operators will reach the dissonance threshold
│ └── = Documentation creates a possible point of return
│
└── 5. DO NOT TARGET THE OPERATOR
└── = Target the MECHANISM
└── = The operator is a symptom, not the cause
└── = "Branch-agnostic" (COGSEC006)
8.4 The dissonance threshold
For some operators — not all — there exists a threshold beyond which cognitive dissonance can no longer be resolved through rationalization. This threshold is reached when:
- The damage becomes visible and can no longer be denied
- An external reference point exists (documentation, publication)
- The cost of rationalization exceeds the cost of recognition
When this threshold is reached, the operator has two options:
- Escalation: reinforce the mechanism to avoid seeing (Festinger, 1956)
- Break: recognize the mechanism and cease operating it
Documentation does not force the break. It makes it possible.
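The threshold logic above reduces to a toy cost comparison (an illustrative sketch; the cost functions and coefficients are assumptions, not measurements). The cost of recognition grows with the number of past executions (section 4.3), the cost of rationalization grows with the visibility of documented evidence, and the break becomes the cheaper exit only when the second exceeds the first:

```python
def operator_choice(executions, documented_evidence):
    """Toy comparison of the two exits from dissonance.
    Coefficients are arbitrary: admitting harm costs more the more cycles
    were already executed; explaining evidence away costs more the more
    visible and verifiable the documentation is."""
    cost_of_recognition = 1.0 * executions
    cost_of_rationalization = 0.5 * documented_evidence
    if cost_of_rationalization > cost_of_recognition:
        return "break"        # recognize the mechanism and cease operating it
    return "escalation"       # reinforce the mechanism to avoid seeing it

# Documentation does not force the break; it shifts which exit is cheaper.
print(operator_choice(executions=10, documented_evidence=5))    # escalation
print(operator_choice(executions=10, documented_evidence=30))   # break
print(operator_choice(executions=40, documented_evidence=30))   # escalation
```

The third call restates the lock-in of section 4.3: the more cycles already executed, the more evidence is required before recognition becomes cheaper than rationalization.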
9. Limitations of the Analysis
9.1 Methodological limitations
| Limitation | Implication |
|---|---|
| Narrative literature synthesis | No original empirical study |
| Context transposition | The cited experiments (Asch, Zimbardo, Milgram) were conducted in controlled settings — transposition to social terrain is theoretical |
| Selection bias | The selected sources support the proposed analytical framework |
| Absence of quantitative data | The respective proportions of the three profiles (Zealot, Proceduralist, Protector) are not empirically established |
| Replication of cited studies | Some studies (notably Zimbardo, 1973) have been subject to methodological criticism and debates about their replicability |
9.2 Interpretive limitations
- Not all individuals who reproduce a group behavior are "cargo cult operators" — social conformity is a normal adaptive mechanism
- Applying the "cargo cult" label to an observed behavior does not prove the existence of an underlying control mechanism
- Incompetence does not imply the absence of intent — some local operators may have partial understanding
- The model does not distinguish between isomorphism (emergent reproduction) and coordination (explicit instruction) — this distinction may be undecidable from the target's perspective
9.3 Risks of misuse
This framework can be misused to:
- Dehumanize operators in return ("they're sheep")
- Avoid considering one's own responsibility in relational dynamics
- Justify a posture of cognitive superiority
- Confuse normal group behavior with a control mechanism
Analytical framework
The cargo cult of control is an analytical model, not a diagnosis. The majority of conformist behaviors are adaptive and non-pathological. The discriminating criterion is the reproduction of a documented conditioning pattern (COGSEC006) by an operator who does not understand its logic.
10. Conclusion¶
The typical operator of a local conditioning cycle is not a strategist. He is a normal individual, operating at the intersection of five documented psychological mechanisms:
- He conforms to the group's behavior (Asch, 1956)
- He justifies himself through moral disengagement (Bandura, 1999)
- He absorbs his role until he can no longer distinguish it from his identity (Zimbardo, 2007; Goffman, 1959)
- He does not see the deviance of his behavior (Vaughan, 1996)
- He overestimates his understanding of what he is doing (Kruger & Dunning, 1999)
And he is locked into this position by cognitive dissonance (Festinger, 1957): recognizing the mechanism would mean recognizing his own actions.
The result: a cargo cult practitioner who executes the form of a control mechanism without understanding its substance. Who produces disproportionate damage precisely because he does not calibrate. And who is incapable of stopping because he is incapable of seeing.
Central thesis
The cargo cult operator is dangerous because he is sincere. He is not lying when he says "it's for his own good." He is not pretending when he does not understand the question. He is not playing a role — he has become the role. And that is exactly what makes him resistant to confrontation and a producer of maximum damage. The form is perfect. No planes land. But the runway is impeccable.
Author's Declaration¶
The author declares:
- No financial conflict of interest
- No institutional affiliation at the time of writing
- That this article constitutes a contribution to the field of cognitive security (COGSEC)
References¶
Arendt, H. (1963). Eichmann in Jerusalem: A Report on the Banality of Evil. New York: Penguin Classics. ISBN: 978-0-14-303988-4. OCLC 405280.
Asch, S.E. (1956). Studies of Independence and Conformity: I. A Minority of One Against a Unanimous Majority. Psychological Monographs: General and Applied, 70(9, Whole No. 416), 1-70. DOI: 10.1037/h0093718
Bandura, A. (1999). Moral Disengagement in the Perpetration of Inhumanities. Personality and Social Psychology Review, 3(3), 193-209. DOI: 10.1207/s15327957pspr0303_3
Bandura, A. (2016). Moral Disengagement: How People Do Harm and Live with Themselves. New York: Worth Publishers. ISBN: 978-1-4641-6006-1. OCLC 934754858.
Cialdini, R.B. (2009). Influence: Science and Practice. 5th ed. Boston: Pearson. ISBN: 978-0-205-60999-4. OCLC 227918150.
DiMaggio, P.J. & Powell, W.W. (1983). The Iron Cage Revisited: Institutional Isomorphism and Collective Rationality in Organizational Fields. American Sociological Review, 48(2), 147-160. DOI: 10.2307/2095101
Festinger, L. (1957). A Theory of Cognitive Dissonance. Stanford: Stanford University Press. ISBN: 978-0-8047-0131-0. OCLC 295965.
Festinger, L., Riecken, H.W., & Schachter, S. (1956). When Prophecy Fails. Minneapolis: University of Minnesota Press. ISBN: 978-1-59147-727-3. OCLC 5765638.
Feynman, R.P. (1974). Cargo Cult Science. Engineering and Science, 37(7), 10-13.
Goffman, E. (1959). The Presentation of Self in Everyday Life. New York: Anchor Books. ISBN: 978-0-385-09402-3. OCLC 59624115.
Haney, C., Banks, C., & Zimbardo, P. (1973). Interpersonal Dynamics in a Simulated Prison. International Journal of Criminology and Penology, 1, 69-97.
Janis, I.L. (1982). Groupthink: Psychological Studies of Policy Decisions and Fiascoes. 2nd ed. Boston: Houghton Mifflin. ISBN: 978-0-395-31704-4. OCLC 8597619.
Kruger, J. & Dunning, D. (1999). Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments. Journal of Personality and Social Psychology, 77(6), 1121-1134. DOI: 10.1037/0022-3514.77.6.1121
Merton, R.K. (1936). The Unanticipated Consequences of Purposive Social Action. American Sociological Review, 1(6), 894-904. DOI: 10.2307/2084615
Meyer, J.W. & Rowan, B. (1977). Institutionalized Organizations: Formal Structure as Myth and Ceremony. American Journal of Sociology, 83(2), 340-363. DOI: 10.1086/226550
Milgram, S. (1974). Obedience to Authority: An Experimental View. New York: Harper Perennial. ISBN: 978-0-06-176521-6. OCLC 668026.
Vaughan, D. (1996). The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago: University of Chicago Press. ISBN: 978-0-226-85175-4. OCLC 34050480.
Zimbardo, P.G. (2007). The Lucifer Effect: Understanding How Good People Turn Evil. New York: Random House. ISBN: 978-1-4000-6411-3. OCLC 71813017.
Prestige Duck Protocol¶
What follows is a summary in protocol form. It is not part of the academic analysis.
They are not evil. It's worse. They are sincere.
The Zealot does more. Because he wants to belong.
The Proceduralist does exactly what he is told. Because the procedure protects him.
The Protector causes harm. Because he believes he is doing good.
None of them understands the mechanism. All of them execute it perfectly.
The runway is impeccable. The fires are lit. The controller is in place. The bamboo bars point toward the sky.
No planes land.
But the runway makes noise. And the noise causes damage. And the damage is real. Even if the planes are not.
Feynman saw them in the laboratories. Asch saw them in the experiment rooms. Zimbardo saw them in a simulated prison. Bandura saw them everywhere.
Normal people. Terribly, terrifyingly normal.
We do not accuse them. We describe them.
With Feynman, 1974. With Bandura, 1999. With Vaughan, 1996.
A cargo cult that is named loses its power. An operator who is described can recognize himself. A mechanism that is published can no longer be invisible.
And the one who documents? Not a prophet. A technician.
Who cites your own manuals.
Pattern by pattern. Reference by reference. Method by method.
COGSEC — Article 007
Prestige Duck Protocol
"You cannot discredit someone who cites your own manuals."
🧠🦆
This article is part of the Cognitive Security project — CogSec.
Coming Next¶
COGSEC008: The Voluntary Autopsy — When the System Produces Its Own Evidence