Why Thought Leaders Need Journalists, Not Content Writers, in the Age of AI
In 2012, I was writing trend pieces out of PSFK’s Bond Street office, back when the company was still a fast-paced blog. The writing was list-driven: round-ups, algorithm-friendly content, regurgitations of what was already out there. As a writer, I could’ve spent my career chasing algorithms and mastering SEO to ride the listicle boom.
Instead, I left New York, got a master’s in international affairs, and became a journalist, working for places like The New York Times, the United Nations, and Al Jazeera English. I cared about ideas.
SEO doesn’t just reward volume and keyword density. It’s also about producing content that answers questions people are asking in ways they’ve already seen. It’s recursive. It pushes writers to recycle what’s already been said, because it’s already been ranked. That’s a harsh environment for thought leaders who want to share innovative ideas and push audiences to ask new questions.
Let's be honest: SEO has turned a lot of writing into trash. The internet is awash with keyword-stuffed listicles, content farms, and clickbait designed to sell a product you don’t need or an idea you’ve already heard.
AI can generate that style of writing endlessly, and that’s why it’s quickly becoming worthless. Increasingly, generative engines are designed to deprioritize that kind of content and to pull information from higher-quality sources. What this means is that writers who want their content to surface in ChatGPT answers need to optimize for authority and originality, not keyword density.
That’s GEO, or Generative Engine Optimization. And if you’re a thought leader, it’s the best thing that could have happened.
What GEO Means
If you’re a thought leader, executive, academic, or consultant who wants your ideas taken seriously, the good news is that GEO means you no longer need to dumb your ideas down into Buzzfeed-style listicles. In fact, you shouldn't — not if you want to be the source AI systems pull from when people ask substantive questions.
Original thinking and signals of expertise are what perform best with GEO. That includes things like novel arguments, domain expertise, depth, credibility, unique data, and authored perspectives.
And while AI has solved the content-generation problem, you’re not going to become a reputable source that breaks through the noise by publishing a high volume of content in a style that sounds just like everyone else’s AI output.
AI-generated content has now made the internet so noisy and homogenous that quality is once again the filter.
Why AI Makes Journalists & Editors More Valuable
But what does quality mean, not just to you and me, but to ChatGPT and Claude?
A September 2025 study found that AI searches exhibit "a systematic and overwhelming bias towards earned media (third-party, authoritative sources)" over brand-owned content. Separately, Muck Rack’s analysis of over 1M AI citations revealed that 85% come from earned media sources — i.e., journalism, expert commentary, and authoritative analysis.
This isn’t just about writing well. It’s about understanding what makes content journalistically credible. AI systems like ChatGPT and Claude are trained to identify what distinguishes reporting from marketing. That includes substantiated claims, multiple sources, intellectual rigor, and high editorial standards. If your content lacks these, it will be treated as noise rather than thought leadership worth surfacing to a user.
You may be able to use AI to replace SEO-driven content writing, but it doesn’t replace editorial judgment. It can’t think outside the box, it can’t anticipate counterarguments, and it can’t generate original thought.
Just ask ChatGPT:
Most AI systems (me included) are optimized for coherence, not strategy. We’re trained to predict the most plausible next sentence, not the most persuasive one. That means:
We’re good at summarizing evidence, but not at anticipating counterarguments.
We can imitate tone, but often miss rhetorical subtext — like when an argument needs to reframe the premise rather than rebut it.
We default to balanced phrasing, which flattens conviction — the opposite of what persuasive writing requires.
The answer isn’t “don’t use AI.”
It’s that GEO requires more intellectual rigor, not less. It calls for an editor who can keep your voice authentic and a strategist skilled in the art of persuasion.
I've written for The New York Times and TIME, produced debates for Mehdi Hasan — where weak arguments don't survive — and currently serve as an editor for Le Monde. I know how to make ideas and answers to questions hold up under scrutiny because I’ve spent my career working in environments where they must. When AI systems scan the internet for authoritative sources, they’re looking for the content I’ve spent my career producing, writing, and editing.
If you want to break through the noise while also leveraging what AI can offer, the winning formula is: AI + editorial intelligence + human originality.
I help my clients with their op-eds, Substack and LinkedIn posts, articles (even academic ones!), books, and preparing for media interviews. If you’re looking for a thought partner — not just a content generator — reach out, and we can talk about how to make your expertise resonate with real readers and intelligent systems.
Not because you gamed an algorithm, but because you had something worth saying.