Published: 2026-04-16

Forget Google SEO: How to Rank in ChatGPT, Claude and Perplexity

Craig Hewitt runs Castos.com — 400+ pages of content, strong Google rankings — but he's invisible in ChatGPT, Perplexity, and Claude. The reason: Google and LLMs evaluate and relay information differently. In this video he shares the seven concrete changes his team made to fix that, anchored by a free open-source Claude Code project called SEO Machine.

Source video

"Forget Google SEO: Rank in ChatGPT, Claude & Perplexity" by Craig Hewitt · Watch on YouTube →

Key Takeaways

  • LLMs and Google read pages differently. Google rewards relevance signals spread across the whole document. LLM scrapers pull from the top of the page down and weight the first clear answer to a query most heavily. A great Google article can fail completely in LLM retrieval if the answer is buried under a narrative introduction.
  • The #1 fix: answer first. For any best/top/how query, the first one to two sentences must directly answer the question. "AI scrapers pull from the top to the bottom of the page. Don't bury the answer behind the narrative." A long ramp-up intro is a liability in LLM SEO even if it signals quality to human readers.
  • Craig shows the failure mode on his own site: an article that opens with "YouTube users watch 700 million hours of podcasts." That is interesting context, but it is not an answer. An LLM evaluating the page would move on to a competitor that leads with the direct answer.
  • SEO Machine is the open-source Claude Code project Craig uses to execute this strategy at scale. It's a skeleton — input your business, product, and customer info, and it generates targeted content. The biggest improvement is the write command, which instructs the model to always answer the question before adding context.
  • This is practical SEO content strategy that applies to any site that wants LLM visibility: direct answers, structured information, question-first framing. The same principles that make content good for LLMs also make it better for humans who want fast answers.

The Core Shift: Answer First, Context Second

Traditional SEO content is often structured as: hook → context → narrative → answer. This works for Google, which can evaluate the whole document for relevance signals, and for humans who scroll to find information.

LLM retrieval is different: the scraper reads top-to-bottom and surfaces the first clear response to the query. If your answer is in paragraph five, after two paragraphs of background and one of statistics, you lose to any competitor whose answer is in paragraph one.
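The positional effect described above can be sketched as a toy scoring function. This is a hypothetical illustration, not the logic of any real scraper: the function name, the overlap heuristic, and the decay constant are all assumptions chosen to show why the same answer scores worse when it appears later on the page.

```python
def retrieval_score(paragraphs, query_terms, decay=0.5):
    """Score a page by its best query match, discounted by how far
    down the page that match appears (toy model of top-down reading)."""
    best = 0.0
    for position, text in enumerate(paragraphs):
        words = set(text.lower().split())
        overlap = len(words & query_terms) / len(query_terms)
        # Earlier paragraphs keep most of their score; later ones decay.
        best = max(best, overlap * decay ** position)
    return best

query = {"best", "podcast", "hosting"}

answer_first = [
    "The best podcast hosting platform for most creators is ...",
    "YouTube users watch 700 million hours of podcasts.",
]
answer_buried = [
    "YouTube users watch 700 million hours of podcasts.",
    "Podcasting has grown steadily for a decade.",
    "The best podcast hosting platform for most creators is ...",
]

# Both pages contain the identical answer sentence; only its position differs.
print(retrieval_score(answer_first, query))   # full score: answer in paragraph one
print(retrieval_score(answer_buried, query))  # decayed score: answer in paragraph three
```

Under this toy model the answer-first page scores 1.0 and the buried-answer page 0.25, which is the whole argument in miniature: identical content, very different retrieval outcomes.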

The rewrite rule Craig uses: "For any best, top, or how query — the first one to two sentences must answer the question." This is what the SEO Machine's write command enforces: it flags the answer-first rule as critical and instructs the model to never open with narrative context when an answer is possible.
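The rewrite rule lends itself to a simple editorial lint check. The sketch below is an assumption for illustration only: SEO Machine's actual write command is a Claude Code prompt, not this code, and the keyword-coverage heuristic and threshold are invented here to make the rule checkable.

```python
import re

def opens_with_answer(article_text, query, max_sentences=2, threshold=0.6):
    """Return True if the first one or two sentences cover most of the
    query's substantive terms, i.e. the article leads with the answer."""
    sentences = re.split(r"(?<=[.!?])\s+", article_text.strip())
    opening = " ".join(sentences[:max_sentences]).lower()
    # Ignore short stop-word-like terms; count query terms the opening covers.
    terms = [t for t in query.lower().split() if len(t) > 3]
    covered = sum(1 for t in terms if t in opening)
    return covered / len(terms) >= threshold

good = ("The best podcast hosting platform for small teams is Castos. "
        "Here is why, and what the alternatives look like.")
bad = ("YouTube users watch 700 million hours of podcasts. "
       "That growth changes how creators think about distribution.")

print(opens_with_answer(good, "best podcast hosting platform"))  # True
print(opens_with_answer(bad, "best podcast hosting platform"))   # False
```

A check like this could run over a content library to flag articles that open with narrative context instead of a direct answer, before any human review.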

SEO Machine — Free Claude Code Project

Craig's open-source SEO Machine project is a Claude Code skeleton designed for content at scale. You provide: information about your business, your product, and your target customers. The project's commands handle the rest — article writing, structure, internal linking patterns, and LLM-optimized formatting.

The project has been in development for 6–8 months with multiple major iterations. It's publicly available (link in the source video description). This is an example of Claude Code being used as the content production engine itself — not just for software development.

