The federal dev background
Before CapitalAI, I spent over a decade as a web developer for the Government of Canada. That meant building bilingual, fully accessible public-facing services that had to work for millions of users — no excuses, no shortcuts. WCAG compliance, bilingual architecture, structured data, and technical performance weren't optional extras. They were the baseline.
Most SEO agencies in Ottawa have marketers who learned technical SEO from blog posts. I learned it by shipping production federal systems. The difference shows in the audit findings — I catch things others don't look for.
Why I built CapitalAI
The SEO industry changed in 2024. Google's AI Overviews, Perplexity's answer engine, and ChatGPT's browsing mode started routing search intent directly — no click required. Businesses that get cited in these AI responses capture the intent. Businesses that don't are invisible, even if they rank on page one of traditional Google.
Ottawa has two well-established SEO agencies: ottawaseo.com and seoplus.ca. Neither of them audits for AI citation visibility. Neither checks whether your business appears when someone asks Perplexity for the best physiotherapist in Kanata, or the top dance school in Orleans. I built an engine to check that — and to fix it.
How the audit engine works
The entire pipeline runs on my local machine — a Windows workstation with an RTX 4090 running Ollama for local language model inference. No client data enters any cloud API. Your website content, competitor data, and audit results never leave my machine.
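To make "local-only" concrete: every inference call targets Ollama's standard HTTP endpoint on localhost. The sketch below is illustrative, not the production code — the helper names and prompt are my own, but the endpoint and request shape are Ollama's documented API.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # local endpoint only

def build_payload(prompt: str, model: str = "llama3.1:8b") -> dict:
    """Request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def local_generate(prompt: str, model: str = "llama3.1:8b") -> str:
    """One inference round-trip against the local model.

    Nothing here touches a cloud API: the request goes to localhost,
    so crawled content and audit notes never leave the machine.
    """
    data = json.dumps(build_payload(prompt, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because the model name and URL are plain parameters, swapping in a different local model is a one-line change.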
The stack: open-source crawler (Crawl4AI + Playwright), local AI model (llama3.1:8b via Ollama), headless Perplexity citation checker, and a Flask control panel. Each audit crawls up to 150 pages of your site, up to 60 pages per competitor, and runs 5–6 AI citation queries targeting Ottawa and your specific neighbourhood.
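As a sketch of how one audit is parameterized — the names and query templates here are hypothetical stand-ins, not the engine's internals — the per-audit caps and citation queries might look like this:

```python
from dataclasses import dataclass

@dataclass
class AuditConfig:
    business_type: str               # e.g. "physiotherapist"
    neighbourhood: str               # e.g. "Kanata"
    city: str = "Ottawa"
    max_client_pages: int = 150      # crawl cap for the client's site
    max_competitor_pages: int = 60   # crawl cap per competitor

def citation_queries(cfg: AuditConfig) -> list[str]:
    """Natural-language questions posed to an answer engine
    (e.g. Perplexity) to test whether the business gets cited."""
    templates = [
        "best {biz} in {hood}",
        "top-rated {biz} near {hood}, {city}",
        "who is the best {biz} in {city}?",
        "recommended {biz} in {city}",
        "{biz} {hood} reviews",
    ]
    return [
        t.format(biz=cfg.business_type, hood=cfg.neighbourhood, city=cfg.city)
        for t in templates
    ]
```

Each query is then run through the headless citation checker, and the report records whether the client's domain appears among the cited sources.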
My working principles
Privacy-first, always
Your data never leaves my machine. Zero cloud APIs in the core pipeline. This isn't a marketing claim — it's an architectural constraint.
Findings from actual crawl data
Every score, every gap, every recommendation in your report comes from real crawl data — not generic templates or industry averages.
Human review at every gate
The AI drafts. I verify. Nothing ships as a client deliverable without human sign-off on every finding. No hallucinations make it into a report.
Bilingual NCR focus
Ottawa and Gatineau are a bilingual market. I audit and build for both English and French audiences — a gap most Ottawa SEO firms ignore entirely.
Background timeline
2010–2015
Federal web developer — Government of Canada
Built bilingual, accessible public-facing digital services. Learned structured data, technical performance, and WCAG compliance from the ground up.
2015–2024
Senior web developer — Ottawa
Continued building production web systems. Began applying SEO engineering principles to client projects. Watched AI search start to reshape how traffic flows.
2024
AI search changes everything
Perplexity, ChatGPT browsing, and Google AI Overviews go mainstream. Ottawa businesses start losing leads to AI answers citing their competitors. The gap is clear.
2025 →
Founded CapitalAI
Built a self-hosted AI SEO audit engine from scratch. First clients: Ottawa small businesses invisible to AI search. First finding: zero schema, zero citations, fixable in weeks.