
Spylert is a free competitive intelligence tool that analyzes any website in about 30 seconds. Users paste a competitor’s URL and the system automatically discovers the site structure, scrapes key pages — pricing, features, changelog, blog, RSS feeds, GitHub releases — and generates an AI-powered executive summary with actionable insights.
The processing pipeline is orchestrated through n8n webhooks. When an audit is initiated, the Next.js backend fires a webhook to n8n, which coordinates the discovery and scraping phases via the Firecrawl API.
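The handoff from Next.js to n8n can be sketched as a small route-handler helper. The webhook URL, payload shape, and field names below are illustrative assumptions, not Spylert's actual contract:

```typescript
// Sketch of how the Next.js backend might hand an audit off to n8n.
// Payload shape and webhook URL are assumptions for illustration.

interface AuditWebhookPayload {
  auditId: string;
  targetUrl: string;
  requestedAt: string; // ISO 8601 timestamp
}

// Build the JSON body an n8n Webhook node would receive.
export function buildAuditPayload(auditId: string, targetUrl: string): AuditWebhookPayload {
  return {
    auditId,
    targetUrl: new URL(targetUrl).toString(), // throws on malformed URLs
    requestedAt: new Date().toISOString(),
  };
}

// Fire the webhook; from here n8n coordinates discovery and scraping.
export async function triggerAudit(
  webhookUrl: string,
  payload: AuditWebhookPayload,
): Promise<boolean> {
  const res = await fetch(webhookUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  return res.ok;
}
```

Validating the URL at this boundary keeps obviously bad input out of the scraping pipeline before any Firecrawl credits are spent.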
Discovered content is then analyzed by an AI model to extract structured data — pricing tiers, feature categories, changelog entries with impact levels, and publication frequency. The compiled report is stored as JSONB in Supabase PostgreSQL and made available through a tabbed interface.
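One way to picture the compiled report is as a single typed object that maps directly onto the JSONB column. The field names and the `compileReport` helper below are hypothetical, chosen only to mirror the data listed above:

```typescript
// Illustrative shape of the report persisted as JSONB in Supabase PostgreSQL.
// All field names here are assumptions, not the production schema.

interface PricingTier {
  name: string;
  monthlyPrice: number | null; // null when pricing is "contact us"
}

interface ChangelogEntry {
  date: string;
  summary: string;
  impact: "major" | "minor" | "patch";
}

interface CompetitorReport {
  url: string;
  pricing: PricingTier[];
  features: Record<string, string[]>; // feature category -> feature names
  changelog: ChangelogEntry[];
  postsPerMonth: number;              // publication frequency
  summary: string;                    // AI-generated executive summary
}

// Assemble the analyzed sections into the one object written to the JSONB column.
export function compileReport(
  url: string,
  parts: Omit<CompetitorReport, "url">,
): CompetitorReport {
  return { url, ...parts };
}
```

Storing the whole report as one JSONB value keeps the schema flexible while each tab of the UI reads its own top-level key.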
Reports are gated behind email verification, with access tokens that expire after 24 hours. Results are cached: repeat audits of the same URL within a 24-hour window return the stored report instead of triggering a new crawl. Requests are also rate limited per IP.
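A time-limited access token can be as simple as an HMAC over the report ID and an expiry timestamp. This is a minimal sketch of that pattern, not Spylert's actual scheme; the token format and secret handling are assumptions:

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Token = reportId.expiresAt.hmac(reportId.expiresAt)
// Assumes reportId contains no "." (e.g. a UUID or numeric ID).
const TTL_MS = 24 * 60 * 60 * 1000; // 24-hour validity window

function sign(reportId: string, expiresAt: number, secret: string): string {
  return createHmac("sha256", secret).update(`${reportId}.${expiresAt}`).digest("hex");
}

export function issueToken(reportId: string, secret: string, now = Date.now()): string {
  const expiresAt = now + TTL_MS;
  return `${reportId}.${expiresAt}.${sign(reportId, expiresAt, secret)}`;
}

export function verifyToken(token: string, secret: string, now = Date.now()): boolean {
  const [reportId, exp, mac] = token.split(".");
  const expiresAt = Number(exp);
  if (!reportId || !mac || Number.isNaN(expiresAt) || now > expiresAt) return false;
  const expected = sign(reportId, expiresAt, secret);
  return (
    mac.length === expected.length &&
    timingSafeEqual(Buffer.from(mac), Buffer.from(expected))
  );
}
```

Because the expiry is signed into the token, no database lookup is needed to reject stale links, and `timingSafeEqual` avoids leaking the MAC through timing differences.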
A real-time polling interface shows users the processing stages as they happen: discovering, scraping, analyzing, compiling, and completed.
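The client-side loop behind that interface can be sketched as below. The stage names come from the list above, but the status-fetcher signature, interval, and attempt cap are assumptions (the fetcher is injected so the loop stays testable without a live endpoint):

```typescript
// Sketch of the polling loop that surfaces pipeline stages to the UI.
type Stage = "discovering" | "scraping" | "analyzing" | "compiling" | "completed";

type StatusFetcher = () => Promise<{ stage: Stage }>;

// Poll until the pipeline reports "completed", invoking onStage on each change.
export async function pollAudit(
  fetchStatus: StatusFetcher,
  onStage: (stage: Stage) => void,
  intervalMs = 2000,
  maxAttempts = 60,
): Promise<Stage> {
  let last: Stage | null = null;
  for (let i = 0; i < maxAttempts; i++) {
    const { stage } = await fetchStatus();
    if (stage !== last) {
      onStage(stage); // only notify the UI when the stage actually changes
      last = stage;
    }
    if (stage === "completed") return stage;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("audit polling timed out");
}
```

Deduplicating on stage changes keeps the UI from re-rendering on every tick, and the attempt cap turns a stuck pipeline into an explicit timeout instead of an infinite spinner.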
The free audit tool is the first piece of a larger vision. I’m currently building out the full Spylert SaaS product — a platform for tracking multiple competitors simultaneously with automated daily crawls, change detection, and alerts when a competitor updates their pricing, ships new features, or publishes new content.