A live GEO audit of inforum.com. 60 seconds. Score: 41/100. The fix on the #1 finding is one line of text. Join us for our live event to see your score: https://luma.com/agxw7yz2
InForum is the daily paper of record for Fargo-Moorhead. Real reporters. Real datelines. Real sources on the ground. When something happens in North Dakota, they're who covers it.
They're also completely invisible to ChatGPT.
Ask Claude what happened at last night's Fargo city council meeting and it cannot cite them. Not because the article doesn't exist — it does, with a named byline, a dateline, three sources quoted by title. The article just can't be retrieved. InForum has blocked every AI crawler on the planet.
We found that in 60 seconds. Tomorrow's GEO: SEO for AI workshop runs this same audit live on your site. Save your seat →
🎥 The Live Audit
See the results in real time and get a preview of what you'll walk away with:
Composite Score: 41/100. Grade: D.
We pointed Chipp's GEO Audit Agent at inforum.com. The "agent" is actually a team — Visibility Analyst, Platform Optimizer, Technical SEO Auditor, Content & E-E-A-T Analyst, Schema Architect, Report Agent — all led by a GEO Strategist.
Three findings came back. Any one of them would hurt. Together, they explain why InForum loses AI-referred traffic every single day.
Finding #1 — Every AI Crawler Is Blocked. Every One.
InForum's robots.txt explicitly disallows 100+ AI user-agents: GPTBot, ClaudeBot, PerplexityBot, Google-Extended (which powers AI Overviews), Gemini-Deep-Research, OAI-SearchBot, Meta-ExternalAgent, MistralAI, DeepSeekBot, and dozens more — blanket Disallow: /.
This isn't a training-data opt-out. It's a total citation blackout.
No AI engine can index, retrieve, or cite a single page on inforum.com. When a Fargo resident asks ChatGPT "what happened at the council meeting last night," InForum cannot be the answer — even if InForum is literally the only place that answer exists.
The fix is one line per crawler: change Disallow: / to Allow: / in robots.txt, or delete the rule outright. It's the highest-ROI action in all of GEO, and it's free.
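As a sketch of what that change looks like (GPTBot shown as one example; the exact entries in InForum's file will differ):

```
# Before: blanket block — no retrieval, no citations
User-agent: GPTBot
Disallow: /

# After: allow retrieval and citation
User-agent: GPTBot
Allow: /
```

Deleting the entry entirely also works: the crawler then falls through to whatever the User-agent: * rules allow.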
Finding #2 — No llms.txt. 404.
inforum.com/llms.txt doesn't exist. Zero structured AI content guidance. No curated index of coverage areas, reporter roster, or authoritative work.
Even if the crawler block dropped tomorrow, AI engines would walk in without a map. A 2–4 hour draft would establish topical authority for ND/MN news at zero ongoing cost.
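For reference, the llms.txt proposal is just a markdown file at the site root: an H1 title, a one-line blockquote summary, then linked sections. A minimal sketch for a regional paper follows; every section name and URL below is an illustrative placeholder, not a real InForum path.

```
# InForum
> Daily news for Fargo-Moorhead and the North Dakota / Minnesota region,
> reported by staff journalists with on-the-ground sources.

## Coverage Areas
- [Fargo city government](https://www.inforum.com/news/fargo): council meetings, budgets, elections
- [State politics](https://www.inforum.com/news/north-dakota): legislature and state agencies

## Newsroom
- [Reporter roster](https://www.inforum.com/staff): bylines, beats, contact info
```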
Finding #3 — The Journalism Is Great. The AI Can't Reach It.
Here's the frustrating part. Individual InForum articles score 78–82/100 for AI citability:
- Named bylines with author bios
- Geographic datelines (DILWORTH —, BISMARCK —)
- Precise timestamps and specific data points
- Named sources with titles
- Fact-dense, self-contained passages
That's exactly what AI engines cite. The reporters are doing the work. The paywall plus the crawler block lock it all behind two doors — and no AI can open either one.
Great content. Zero distribution to the fastest-growing channel in marketing.
This Isn't an InForum Problem
AI-referred traffic is up 527% year over year. It converts 4.4x higher than organic search. Gartner projects traditional search drops 50% by 2028. Only 23% of marketers are doing anything about it.
Most sites we audit have at least one of InForum's three issues. Plenty have all three.
The gap between "cited by ChatGPT" and "invisible to ChatGPT" is almost never a content problem. It's a configuration problem most teams don't know they have.
What Tomorrow's Workshop Covers
Thursday, April 23, 2026 — 90 minutes live. GEO: SEO for AI.
You'll leave with:
- A live audit of your site. Same 60-second audit we ran on InForum. On the call. Your URL.
- The GEO Audit Agent template inside Chipp. The 6-specialist team above, yours to deploy on any site on demand.
- The agency sales playbook. How agencies are pricing GEO retainers at $2K–$12K/month — discovery scripts, cold email templates, objection handling.
- Platform-specific optimization guides. What ChatGPT cites vs. Perplexity vs. Google AIO. Only 11% overlap.
Who it's for:
- Marketing leads and in-house SEO teams — add GEO to your 2026 strategy before Q3.
- Agencies and consultants — add a high-margin service line your clients don't know they need yet.
- Founders — most of your competitors aren't optimizing for AI search yet. This is your window.
Run the Audit on Your Own Site
Easiest place to start: open your own robots.txt right now. Search for GPTBot, ClaudeBot, or Google-Extended. If any of those have a Disallow, you're leaving AI-referred traffic on the table the same way InForum is.
Then check yourdomain.com/llms.txt. If it's a 404, that's finding #2.
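The robots.txt check above can be scripted with Python's standard urllib.robotparser. A minimal offline sketch: the bot list is a small sample of the 100+ user-agents from the audit, and `sample` stands in for your own file's contents.

```python
from urllib.robotparser import RobotFileParser

# A handful of well-known AI crawler user-agents (sample, not exhaustive).
AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended", "OAI-SearchBot"]

def blocked_ai_bots(robots_txt: str, site: str = "https://example.com") -> list[str]:
    """Return the AI user-agents that may NOT fetch the site's homepage."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [bot for bot in AI_BOTS if not parser.can_fetch(bot, site + "/")]

# Stand-in for your own robots.txt content.
sample = """\
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /
"""

print(blocked_ai_bots(sample))  # → ['GPTBot', 'ClaudeBot']
```

Paste your live robots.txt into `sample` (or fetch it yourself); any bot in the output is a crawler that cannot retrieve or cite your pages.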
Fixing both takes an afternoon. Knowing the other 40 things to fix takes a workshop.
