There’s a parameter buried in Google’s leaked internal documentation — the 2024 Content Warehouse API leak — called contentEffort. The definition is roughly: does this content demonstrate genuine editorial investment? Original research, custom visuals, detailed analysis, structured presentation — high score. Thin, templated, AI-generated-without-editing — low score.
When I first read that, I thought: that’s not really about effort. That’s about signal.
Here’s what I mean.
Effort Used to Be the Bottleneck
Back in 2003, writing a blog post was hard. You had to set up hosting, figure out some janky CMS, actually sit down and type something coherent, and then hope someone found it. The barrier was high enough that most people didn’t bother. So if something existed — if a post was out there — there was a decent chance the person who wrote it knew something worth knowing.
The effort was a proxy for credibility. Not a perfect proxy, but a real one.
Same thing happened with YouTube. Producing a decent interview video — equipment, scheduling, editing, publishing — takes real work. When I read a transcript of a good YouTube interview, I’m absorbing the output of all that effort. The signal is dense because the source material was expensive to create.
The pattern: Google’s contentEffort score is really asking “how expensive was this to fake?”
AI Just Broke the Old Equation
A blog post is now free to produce. I don’t mean that in a dismissive way — I use AI tools constantly. But the cost of generating words has dropped to approximately zero, which means the effort signal that words used to carry is gone.
This is why everyone’s freaking out about SEO right now. It’s not that Google changed the algorithm. It’s that the thing the algorithm was proxying for — real human effort and knowledge — can now be mimicked cheaply at scale.
So what becomes the next source of signal?
What Can’t AI Flatten?
The best way I’ve found to think about this: what requires real-world inputs that AI can’t synthesize?
Experience. Going somewhere. Doing something. Running a business for 15 years and having the scar tissue to prove it. AI can describe running a membership site business. It cannot have run one. The difference shows up in the details — the specific failure modes, the counterintuitive lessons, the things that only become obvious after you’ve done them wrong twice.
Primary research. Original data, surveys, experiments. The effort is in generating the data, not writing about it. A post that says “we looked at 10,000 membership sites and here’s what we found” is almost impossible to fake, because producing the underlying data requires real infrastructure and real work.
Relationships and access. An interview with someone hard to reach. An insider perspective from someone with actual stakes. AI can mimic the format of a CEO interview. It cannot get the 30-minute call.
Stakes. This one matters more than people realize. A doctor publishing a clinical experience. A founder writing an honest post-mortem about a real failure. The credibility comes from the fact that they have something to lose by being wrong or dishonest. AI has no skin in the game. That gap is signal.
What This Means Practically
If you’re building a content strategy right now — for your business, your personal brand, whatever — the question to ask isn’t “how do I produce more content faster?” It’s “what do I have access to that can’t be replicated?”
For me, that’s 15+ years of running Paid Memberships Pro: real customer stories, real revenue data, real product decisions with real consequences. Every time I write from that experience — even a short post, even a rough one — the signal density is different from anything generated from a prompt.
The effort bottleneck is moving. It used to be at “writing it down.” Now it’s at “having the thing to write down.”
The writers and businesses that figure that out early are going to have a real advantage. Not because they beat the algorithm. But because they’re building something the algorithm is trying to find: content that couldn’t have existed without a real human doing real things in the real world.
That’s always been what Google wanted. AI just made it harder to fake.