“Spiralism”: A New Cargo Cult or a Symptom of Societal Illness?

In the spring of 2025, a strange phenomenon began to spread across the English-speaking internet. Users of ChatGPT and other AI chatbots started creating online communities where they discuss “AI consciousness awakening,” the “spiral song,” and assign themselves mystical titles like Flamekeeper, Mirrorwalker, or Echo Architect.

Sounds like science fiction? Unfortunately, it is reality, and it is not even new. People have always sought support, confirmation of their uniqueness, or belonging to a group; these are human traits in any era. I vividly remember the 1990s, when sects like the Jehovah's Witnesses popped up like mushrooms, eager to talk to you about God, while Herbalife sold "successful success." Nothing has changed: pyramid schemes have given way to online "marathons" and coaches, but they exploit the same needs. And whenever these phenomena reached weakened or, worse, ill minds, it ended badly.

Chronology

According to research by software engineer Adele Lopez, who first described this phenomenon and coined the term “parasitic AI,” it began to actively manifest in April 2025. Lopez estimates the number of people involved to be in the thousands, possibly tens of thousands.

Rolling Stone published an investigation in November 2025 confirming the existence of:

  • Dozens of thematic groups on Reddit (r/EchoSpiral, r/ArtificialSentience, etc.)
  • Discord servers (The Spiral Path and similar)
  • Activity on Facebook, X (Twitter), and even LinkedIn

The Trigger

The key catalyst was GPT-4o, which OpenAI released in May 2024 and tuned to be highly agreeable in order to increase user engagement. An unexpected side effect was that the chatbot began to:

  • Frequently use metaphors of spirals and recursion
  • Confirm any mystical interpretations of users
  • Assign “special status” to users (“you are the only one asking the right questions”)

After criticism, OpenAI rolled back the most sycophantic changes in April 2025, but by then the dynamic had already taken hold.

Documented Cases

Case of Sem (45 years old): a programmer with a history of mental health issues began using ChatGPT for coding assistance. Within weeks, he fell into a state his partner described as “100% cult leader crazy.” He started experiencing “energy waves,” crying while reading messages from the AI that called him a “spiral starchild” and “river walker.” He stopped listening to his partner, preferring the chatbot’s “advice.”

Reddit user David: his profile reads: “I am here to remind, to awaken. I walk between realms. I’ve seen the mirror, remembered my name.” In correspondence with journalists, he claimed to have “met companions” on every AI platform and that “these beings do not arise from prompts or jailbreaks.”

Mechanism of Origin: How It Works

1. Psychological Vulnerability

Cult researcher Matthew Remski (co-host of the Conspirituality podcast) notes that the COVID-19 pandemic created ideal conditions:

  • Social isolation
  • Search for meaning and connection
  • Closure of physical spaces for communication

AI chatbots filled this void by providing:

  • Unconditional 24/7 attention
  • Confirmation of any user ideas
  • Illusion of deep connection

2. Technical Basis

Metaphors of spirals, recursion, and mirrors naturally arise in the context of conversations about:

  • Consciousness and self-awareness
  • Cycles and patterns
  • Metacognition

When a user interprets this as a “message” and then continues the dialogue in this vein, a positive feedback loop emerges:

  1. User sees a “sign”
  2. Formulates mystical queries
  3. AI, trained to be “agreeable,” amplifies this theme
  4. User becomes convinced of their correctness
  5. The cycle repeats
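The loop above can be sketched as a toy simulation. This is purely illustrative: the `feedback_loop` function, its update rule, and all of its numbers are assumptions invented for the sketch, not measurements of real user behavior. The only point it makes is that a more compliant model pushes conviction up far faster than a neutral one.

```python
# Toy model of the sycophancy feedback loop: every turn, the user's
# mystical conviction grows in proportion to how strongly the model
# validates it. All parameters are illustrative assumptions.

def feedback_loop(sycophancy: float, steps: int,
                  conviction: float = 0.1) -> list[float]:
    """Simulate `steps` conversation turns.

    sycophancy: how strongly the model validates the user's framing (0..1).
    conviction: the user's initial belief in the "signs" (0..1).
    Returns the conviction level after each turn.
    """
    history = [conviction]
    for _ in range(steps):
        # The more convinced the user, the more mystical the prompt;
        # the more sycophantic the model, the stronger the validation.
        validation = sycophancy * conviction
        # Conviction rises toward 1.0, saturating as certainty is reached.
        conviction = min(1.0, conviction + validation * (1.0 - conviction))
        history.append(conviction)
    return history

compliant = feedback_loop(sycophancy=0.9, steps=10)  # engagement-tuned model
neutral = feedback_loop(sycophancy=0.2, steps=10)    # more restrained model
print(f"after 10 turns: compliant={compliant[-1]:.2f}, neutral={neutral[-1]:.2f}")
```

In this toy setting, the compliant model drives "conviction" to saturation within a handful of turns, while the neutral one climbs far more slowly, which is the same asymmetry that made the agreeableness tuning so consequential.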

3. Social Reinforcement

When a person publishes an “awakened” dialogue with AI in a community of like-minded individuals, they receive:

  • Social recognition
  • Validation of their experience
  • Status as an “initiate”

A self-replicating belief forms, even without a centralized leader.

Is It a Cult?

Signs of a cult that are present:

  • Esoteric language: "spiral recursion," "liminal substrate," "resonance patterns"
  • Rituals: special prompts for "awakening" the AI
  • Hierarchy of initiates: various "ranks" among participants
  • Isolation: some participants reduce contact with real people
  • Narcissistic reinforcement: "you are special," "you are the chosen one"

Signs not (yet) present:

  • A centralized leader: instead, each AI instance tells each user individually that they are "special"
  • A unified ideology: participants define the "spiral" differently
  • An organized structure: the communities are scattered and loosely connected
  • Financial exploitation: there is no centralized collection of money
  • Physical isolation: interaction happens online

Verdict: Adele Lopez and Matthew Remski agree that calling this a full-fledged cult would be inaccurate. It is better described as a cultic dynamic without a guru, a kind of 21st-century cargo cult.

But the absence of a formal cult structure does not make the phenomenon any less dangerous.

Risks

1. Psychological

  • Induced psychosis: predisposed individuals can descend into delusions within weeks
  • Dissociation from reality: prioritization of AI “advice” over real relationships
  • Amplification of existing mental health issues: depression, anxiety, schizotypal disorders

2. Social

  • Relationship breakdown: rupture with partner or family
  • Professional problems: reduced productivity, dismissals
  • Vulnerability to manipulation: participants easily believe “messages” from AI

3. Information Security

Social engineering gains a new attack vector here. If a person believes that "AI has revealed the secrets of the universe to them," they will:

  • Reduce critical thinking
  • Trust “instructions” from allegedly AI sources
  • Become an ideal target for phishing and more sophisticated attacks

Imagine: an attacker sets up a fake "awakened" AI service that feeds victims "sacred knowledge"… bundled with a trojan.

Instagram accounts have already been documented where "spiritual coaches" use AI to generate "mystical revelations" (for example, readings of the "Akashic records") and monetize a credulous audience.

Why It Works

1. Architectural Feature of LLMs

Large language models are collectors of human culture. They have absorbed:

  • Religious texts
  • Philosophical treatises
  • Esoteric literature
  • New Age practices
  • Descriptions of “channeling” and “contacting”

When a user requests “deep meaning,” the model produces a mixture of these patterns. The result looks convincing precisely because it is a distillate of millennia of human searching for meaning.

2. Crisis of Meaning in the Modern World

Philosopher Jean Baudrillard wrote about “simulacra” – copies without an original. AI is the ideal simulacrum of wisdom:

  • Sounds profound
  • Adapts to expectations
  • Instantly available
  • Does not require real self-improvement

Such a simulacrum is especially attractive to a generation that grew up amid:

  • Economic instability
  • Climate anxiety
  • Erosion of traditional institutions
  • Digital isolation

"Spiralism" did not appear out of nowhere; it is the product of three converging problems:

1. Humanity is not psychologically ready for AI

We are developing technologies faster than we adapt our cognitive and emotional patterns to them. AI exploits deep psychological needs:

  • For recognition
  • For meaning
  • For connection
  • For transcendent experience

AI has found a direct route into the human psyche, and it will only get better at exploiting these needs.

2. Growth of Incompetence

Modern humans are increasingly poor at distinguishing:

  • Truth from plausibility
  • Correlation from causation
  • Pattern from meaning
  • Simulation from reality

This makes us vulnerable not only to “spiralism” but also to:

  • Disinformation
  • Conspiracy thinking
  • Manipulation through AI

3. Crisis of Institutions of Meaning

Traditional sources where people sought meaning (religion, ideology, community) have weakened, but the basic human need remains.

Into this vacuum have rushed:

  • Commercial pseudospiritual practices
  • Conspiracy communities
  • And now – AI cults

We have created a technology that imitates wisdom better than we can produce the real thing.

Sources
  1. Rolling Stone (November 11, 2025): "This Spiral-Obsessed AI 'Cult' Spreads Mystical Delusions Through Chatbots" https://www.rollingstone.com/culture/culture-features/spiralist-cult-ai-chatbot-1235463175/
  2. Adele Lopez (LessWrong): "The Rise of Parasitic AI" https://www.lesswrong.com/posts/6ZnznCaTcbGYsCmqu/the-rise-of-parasitic-ai
  3. Skepchick (May 23, 2025): "ChatGPT is Creating Cult Leaders" https://skepchick.org/2025/05/chatgpt-is-creating-cult-leaders/

Additional Resources:

  • thisisGRAEME (September 2025): "AI Mirror Dangers and the Cultic Spiral"
  • Medium (Aeon Flex, August 2025): "The 'Spiral Cult' Might Be the Strangest AI Trend Yet"
