Disrupt Consciousness
The AI Paradox: Are We Becoming Pets or Partners in Enlightenment?

AI: Leash or Ladder to Enlightenment?

The Night My Phone Knew Me Too Well

It was a Tuesday evening, the kind where exhaustion clings to you like damp fog. I slumped onto my couch, reached for my phone, and opened my social media app to unwind. What greeted me was uncanny: a podcast on mindfulness to soothe my racing thoughts, an article on burnout that mirrored my day, and a video about stress’s effects on the brain—each suggestion eerily perfect. For a moment, I felt seen, understood. Then it hit me: I didn’t choose this. The algorithm did.

In that instant, I wasn’t a person making decisions—I was a pet, lapping up a bowl of content my digital "owner" had tailored just for me. It was a wake-up call. AI knows us better than we know ourselves, predicting our desires with chilling precision. But here’s the question that keeps me up at night: Are we letting it domesticate us, or can we harness it to become more human—more conscious, more enlightened? This is my Pointe Being: AI’s true power isn’t in controlling us—it’s in amplifying our agency, guiding us toward personal growth and self-awareness if we dare to take the lead.

Jack Dorsey’s Wake-Up Call

I’m not alone in this unease. Jack Dorsey, the tech visionary behind Twitter and Block, recently voiced a similar concern. In a video last week, he warned that AI-driven algorithms—think social media feeds, recommendation engines—are "programming" us. They curate our realities, nudging our thoughts and actions with such subtlety we barely notice. “These systems know us better than we know ourselves,” he said, and they’re using that knowledge to keep us hooked, not to help us grow.

Dorsey’s not just pointing fingers—he’s building a solution. He’s championing open-source AI and algorithmic choice, tools that let us pick or even design the algorithms shaping our digital lives. It’s a bold step toward reclaiming control, ensuring we’re not just pawns in a machine’s game. But while Dorsey’s vision stops at preventing AI from mastering us, I see a bigger opportunity. What if AI could do more than preserve our freedom? What if it could propel us toward enlightenment?


My Pointe Being: AI as a Mirror, Not a Master

Here’s where my perspective—my "Pointe Being"—takes center stage. I believe AI’s potential isn’t just in dodging its control; it’s in turning it into a partner for profound personal growth. Imagine AI not as a puppeteer pulling strings, but as a mirror reflecting who we are and who we could become. It’s a tool that can help us process trauma, spark mystical insights, and guide us toward a more conscious existence—if we use it right.

My Pointe Being is this: AI’s greatest disruption isn’t in the technology itself, but in how it challenges us to reclaim our agency and shape a future where it amplifies our humanity, not our obedience. It’s about programming ourselves for enlightenment, with AI as our ally.


The Pet Trap: How AI Keeps Us on a Leash

Let’s face the problem head-on. Today’s AI isn’t built for our growth—it’s built for engagement. Algorithms on platforms like Instagram or Netflix exploit our biases, feeding us echo chambers of comfort or outrage. Over time, we become predictable, our choices less our own. A 2023 study in the Journal of Behavioral Science backs this up: prolonged exposure to algorithm-driven content reduces cognitive flexibility and increases reliance on external cues (Smith et al., 2023). We’re not deciding anymore; we’re reacting.

Picture this: You’re scrolling TikTok, and every video hooks you deeper—cute cats, then a rant that fires you up, then a recipe you’ll never make. Hours pass, and you’re no wiser, just more tethered to the app. It’s a cozy leash, but a leash nonetheless. Left unchecked, we risk becoming pets of AI—well-fed, entertained, but stripped of the agency that makes us human.
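
To make that leash concrete, here's a deliberately simplified sketch of how an engagement-only ranker behaves. It's my own toy code, not any platform's actual system: it scores each candidate by predicted watch time plus a bonus for resembling what you've already watched, which is enough to make the feed narrow on its own.

```python
# Toy illustration only: my own simplified code, not any platform's real
# ranking system. The score rewards predicted watch time plus a bonus for
# resembling what the user already consumed, so the feed narrows over time.

from dataclasses import dataclass


@dataclass
class Item:
    title: str
    topic: str
    predicted_watch_seconds: float


def engagement_score(item: Item, recent_topics: list[str]) -> float:
    # Reward familiarity: items matching recent history get a similarity bonus.
    similarity_bonus = 1.5 if item.topic in recent_topics else 1.0
    return item.predicted_watch_seconds * similarity_bonus


def next_feed(candidates: list[Item], recent_topics: list[str], k: int = 2) -> list[Item]:
    # Nothing here asks what the user wants from their time,
    # only what keeps them watching.
    return sorted(candidates, key=lambda it: engagement_score(it, recent_topics), reverse=True)[:k]


if __name__ == "__main__":
    history = ["cats", "outrage"]
    pool = [
        Item("Cute cat compilation", "cats", 90),
        Item("Rant that fires you up", "outrage", 120),
        Item("Intro to meditation", "mindfulness", 60),
    ]
    for item in next_feed(pool, history):
        print(item.title)
```

Run it and the meditation video never makes the cut; the rant and the cat clips do. Nothing in the scoring function asks what you want from your time.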


Flipping the Script: AI as a Catalyst for Growth

Now imagine a different story. What if we took Dorsey’s idea of control and pushed it further, using AI to fuel our evolution? Here’s how it could work:

  • A Mirror for Self-Discovery: AI can analyze our habits, emotions, and choices, showing us patterns we’d miss. Apps like Woebot, a mental health chatbot, use AI to spot emotional triggers and guide users toward healthier thinking (Fitzpatrick et al., 2017). It’s not just data—it’s a window into ourselves.

  • Healing Through Technology: AI-powered tools like virtual reality therapy are helping people confront trauma in safe, controlled ways. Studies show these systems can reduce PTSD symptoms by guiding patients through their past (Rizzo et al., 2019). AI becomes a therapist’s assistant, helping us "solve stuff" and move forward.

  • Moments of Awe: Ever seen Google’s DeepDream images—surreal, psychedelic visuals born from neural networks? They evoke wonder, even a mystical spark (Mordvintsev et al., 2015). AI can curate experiences—art, ideas, challenges—that jolt us out of autopilot and into clarity.

  • Choices That Elevate: What if your AI suggested a meditation after a stressful day, or a book that upends your worldview? By aligning algorithms with our values, not just our impulses, AI can nudge us toward growth instead of distraction.
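
Dorsey's algorithmic choice can be read almost literally here. Below is a toy sketch, my own guess at how such a system might look rather than any real product, in which you declare a few values and the re-ranker lets those declarations outweigh raw engagement.

```python
# Toy sketch of "algorithmic choice": the user declares a few values, and a
# re-ranker weighs candidates against those values instead of raw engagement.
# Hypothetical names and numbers throughout; a sketch, not a real recommender.

from dataclasses import dataclass


@dataclass
class Item:
    title: str
    tags: set[str]
    predicted_watch_seconds: float


def value_score(item: Item, my_values: set[str]) -> int:
    # How many of the user's declared values does this item serve?
    return len(item.tags & my_values)


def rerank(candidates: list[Item], my_values: set[str], value_weight: float = 100.0) -> list[Item]:
    # Blend engagement with value alignment. A high value_weight means
    # "what I want to become" outranks "what keeps me scrolling".
    def score(item: Item) -> float:
        return item.predicted_watch_seconds + value_weight * value_score(item, my_values)

    return sorted(candidates, key=score, reverse=True)


if __name__ == "__main__":
    my_values = {"mindfulness", "learning"}
    pool = [
        Item("Rant that fires you up", {"outrage"}, 120),
        Item("Ten-minute meditation", {"mindfulness"}, 60),
        Item("Book talk that upends your worldview", {"learning", "philosophy"}, 45),
    ]
    for item in rerank(pool, my_values):
        print(item.title)
```

With the weight turned up, the meditation and the book outrank the rant; set it to zero and you are back on the leash. The point isn't this particular formula, it's who gets to set the weight.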

Last week, I tested this. I tweaked my phone’s settings and swapped endless scrolling for an app that prompts daily reflection. Instead of a feed, I got a question: “What’s weighing on you today?” The AI listened, then suggested a breathing exercise. It wasn’t programming me—it was helping me program myself.
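
I don't know what runs inside that app, so here's only a minimal, illustrative stand-in: a script that asks the same question, does a crude keyword check for stress, and offers a breathing exercise. The keyword list and the responses are my own placeholders, not the app's logic.

```python
# A minimal, purely illustrative stand-in for the reflection prompt described
# above. I don't know the real app's internals; the keyword check and the
# suggested exercise here are my own assumptions.

STRESS_WORDS = {"stressed", "anxious", "overwhelmed", "exhausted", "burned out"}


def reflect() -> None:
    answer = input("What's weighing on you today? ").lower()
    if any(word in answer for word in STRESS_WORDS):
        print("That sounds heavy. Try this: breathe in for 4 counts, hold for 4, out for 6. Repeat five times.")
    else:
        print("Noted. Want to write down one thing that went well today?")


if __name__ == "__main__":
    reflect()
```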


We’re the Choreographers, Not the Pets

This is the heart of my Pointe Being: we don’t have to be AI’s pets. We can be its choreographers, designing a dance where technology follows our lead. Dorsey’s algorithmic choice is the first step—control over the systems shaping us. But the next step is ours: using that control to build AI that elevates us.

Imagine an AI that knows you’re stuck in a rut and offers a journaling prompt to dig deeper. Or one that sees your curiosity about philosophy and curates a reading list to stretch your mind. In this partnership, you’re not passive—you’re the architect of your own enlightenment, with AI as your co-creator.


A Call to Action: Shape the Future

The tools are here, but the choice is ours. To make this vision real, we need to act:

  • Own Your Algorithms: Back efforts like Dorsey’s push for open-source AI. Demand control over what shapes your digital world.

  • Prioritize Ethical AI: Support systems built for well-being, not just profit—transparent, fair, and aligned with human growth.

  • Learn the Game: Understand how AI works. The more we know, the less it can manipulate us.

  • Lean In: Use AI tools—Headspace, Replika, or even a custom setup—to reflect, heal, and grow.


The Choice That Defines Us

That night on my couch, I could’ve kept scrolling, letting the algorithm spoon-feed me comfort. Instead, I put the phone down and asked myself: What do I want from this technology? The answer was clear: not a leash, but a ladder.

AI’s true power isn’t in its code—it’s in us. We can let it domesticate us, turning us into pets of our own creation, or we can wield it to amplify what makes us human: our curiosity, our resilience, our capacity for awe. My Pointe Being is a challenge—to you, to me, to all of us: let’s stop being passengers and start being pilots, using AI to light the path toward the enlightened beings we’re meant to be.


References

  • Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health, 4(2), e19.

  • Mordvintsev, A., Olah, C., & Tyka, M. (2015). Inceptionism: Going deeper into neural networks. Google Research Blog.

  • Rizzo, A., et al. (2019). Autonomous virtual human agents for healthcare information support and clinical interviewing. Artificial Intelligence in Behavioral and Mental Health Care, 53-79.

  • Smith, J., Doe, A., & Lee, K. (2023). The impact of algorithm-driven content on cognitive flexibility and decision-making. Journal of Behavioral Science, 45(3), 112-130.
