audit-website


Use the squirrelscan CLI (squirrel) to audit websites, covering over 140 rules in SEO, technical aspects, content, performance, security, etc. This skill applies when you need to analyze website health, troubleshoot technical SEO issues, check for broken links, verify meta tags and structured data, generate site audit reports, compare before and after website revamps, or when terms like 'website audit', 'audit website', 'squirrel', 'site health check' are mentioned.


NPX Install

```bash
npx skill4agent add kunhai-88/skills audit-website
```


Website Audit Skill

Use the `squirrel` CLI from squirrelscan to audit websites for SEO, technical health, content, performance, and security.
`squirrel` supports macOS, Windows, and Linux. It simulates browsers and search crawlers, analyzes site structure and content against more than 140 rules, and outputs a list of issues along with repair suggestions.

Links

Rule documentation template:
https://docs.squirrelscan.com/rules/{rule_category}/{rule_id}

Example: https://docs.squirrelscan.com/rules/links/external-links

What This Skill Can Do

Enables AI agents to audit websites against 20+ categories and 140+ rules, including:
  • SEO: Meta tags, title, description, canonical, Open Graph
  • Technical: Broken links, redirect chains, page speed, mobile-friendliness
  • Performance: Load time, resource usage, caching
  • Content: Heading structure, image alt text, content analysis
  • Security: Exposed secrets, HTTPS, security headers, mixed content
  • Accessibility: Alt text, color contrast, keyboard navigation
  • Usability: Form validation, error handling, user flows
  • Links: Broken internal and external link detection
  • E-E-A-T: Experience, expertise, authority, trustworthiness
  • Mobile: Mobile-friendliness, responsiveness, touch elements
  • Crawlability: robots.txt, sitemap.xml, etc.
  • Schema: Schema.org, structured data, rich snippets
  • Legal: Compliance with privacy policies, terms of service, etc.
  • Social: Open Graph, Twitter Cards, and schema validation
  • URL Structure: Length, hyphens, keywords
  • Keywords: Keyword stuffing detection
  • Images: Alt text, contrast, size, format
  • Local SEO: NAP consistency, geographic metadata
  • Videos: VideoObject schema, accessibility
The audit will crawl the site, analyze pages against rules, and generate a report including:
  • Overall health score (0–100)
  • Breakdown by category (Core SEO, Technical SEO, Content, Security, etc.)
  • Specific issues and affected URLs
  • Broken link list
  • Actionable improvement suggestions

When to Use

Use this skill in the following scenarios:
  • Analyze website health
  • Troubleshoot technical SEO issues
  • Fix the various issues mentioned above
  • Check for broken links
  • Verify meta tags and structured data
  • Generate site audit reports
  • Compare health scores before and after website revamps
  • Improve performance, accessibility, SEO, security, etc.

Prerequisites

This skill depends on the squirrel CLI, which must be installed and added to your PATH.

Installation (macOS / Linux)

```bash
curl -fsSL https://squirrelscan.com/install | bash
```
This will:
  • Download the latest binary
  • Install to `~/.local/share/squirrel/releases/{version}/`
  • Create a symlink at `~/.local/bin/squirrel`
  • Initialize configuration at `~/.squirrel/settings.json`
If `~/.local/bin` is not in your PATH, add this to your shell configuration:
```bash
export PATH="$HOME/.local/bin:$PATH"
```
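To make that change persist across installs without piling up duplicate lines, the export can be appended idempotently. A minimal sketch, assuming a bash setup that reads `~/.bashrc`:

```shell
#!/usr/bin/env bash
# Append the PATH export to ~/.bashrc only if that exact line is absent,
# so re-running the installer (and this snippet) never duplicates it.
RC="$HOME/.bashrc"
LINE='export PATH="$HOME/.local/bin:$PATH"'

touch "$RC"
if grep -qxF "$LINE" "$RC"; then
  echo "PATH entry already present in $RC"
else
  printf '%s\n' "$LINE" >> "$RC"
  echo "added PATH entry to $RC"
fi
```

Running it a second time prints the "already present" message and leaves the file untouched.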

Windows Installation

PowerShell:
```powershell
irm https://squirrelscan.com/install.ps1 | iex
```
This downloads and installs to `%LOCALAPPDATA%\squirrel\` and adds it to your PATH. If using CMD, you may need to restart the terminal for the PATH changes to take effect.

Verify Installation

```bash
squirrel --version
```

Configuration

Running `squirrel init` in your project directory generates `squirrel.toml`.
Each project should have a unique project name (defaults to the audited site's name) to distinguish multiple audits in the database:
```bash
squirrel init --project-name my-project
```
Or:
```bash
squirrel config set project.name my-project
```
If there is no `squirrel.toml` in the current directory, first run `squirrel init` and specify the project name with `-n` (it can also be inferred).
The project name identifies the project in the database, stored under `~/.squirrel/projects/`.
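The "init only when `squirrel.toml` is missing" rule above can be wrapped in a small guard. A sketch in dry-run form (it prints the command rather than executing it, so it works even without the CLI installed); `my-project` is a placeholder name:

```shell
#!/usr/bin/env bash
# maybe_init: run `squirrel init` only if the current directory lacks
# squirrel.toml. With DRY_RUN=1 (the default here) it just prints the
# command it would run instead of executing it.
maybe_init() {
  local name="${1:-my-project}"
  if [ -f squirrel.toml ]; then
    echo "squirrel.toml already present; skipping init"
  elif [ "${DRY_RUN:-1}" = "1" ]; then
    echo "would run: squirrel init --project-name $name"
  else
    squirrel init --project-name "$name"
  fi
}

maybe_init my-project
```

Set `DRY_RUN=0` to execute the real command once the CLI is on your PATH.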

Usage

Overview

There are three subcommands, and all results are written to the local project database:
  • crawl: Execute or resume a crawl
  • analyze: Analyze crawled results
  • report: Output reports in a specified format (llm, text, console, html, etc.)
`audit` is a wrapper that runs the three steps above in sequence:
```bash
squirrel audit https://example.com --format llm
```
Prefer `--format llm`: a compact, complete output format designed for LLMs.

Selecting Audit Targets

  • If the user provides no URL: Infer candidate sites from the current directory, environment variables (e.g., Vercel projects), memory, or references in code.
  • If a local dev server can be started in the current directory: Audit the local site.
  • If multiple auditable sites are found: Ask the user to select one.
  • If no sites can be inferred: Ask the user for the URL to audit.
Prefer auditing live sites, since they better reflect real-world performance and rendering issues. If both a local and a live site are available, ask the user to choose and recommend the live one. After the live-site audit identifies issues, fix them in the local code.

When Implementing Fixes

  • Split large-scale fixes into parallel subtasks and use subagents to accelerate the process.
  • After fixes are complete, run typecheck/format tools (e.g., ruff, biome, tsc) on the generated code if available in the environment.

Basic Workflow

  1. Execute Audit (writes to database and outputs to terminal)
  2. Export Report in Specified Format
```bash
# 1. Execute audit (default console output)
squirrel audit https://example.com

# 2. Export as LLM format
squirrel report <audit-id> --format llm
```

Common Options

Crawl more pages:
```bash
squirrel audit https://example.com --max-pages 200
```
Ignore cache and force a full recrawl:
```bash
squirrel audit https://example.com --refresh
```
Resume an interrupted crawl:
```bash
squirrel audit https://example.com --resume
```
Detailed output for debugging:
```bash
squirrel audit https://example.com --verbose
```
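These options compose into a simple batch run when several sites need auditing. A sketch, shown as a dry run (it prints each command instead of executing it, so it runs without the CLI); the URLs are placeholders and only the documented flags are used:

```shell
#!/usr/bin/env bash
# Print the audit command for each site in the list (dry run).
# Replace `echo` with direct execution once squirrel is installed.
audit_all() {
  local url
  for url in "$@"; do
    echo "squirrel audit $url --max-pages 200 --format llm"
  done
}

audit_all https://example.com https://example.org
```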

Command Options

audit

| Option | Alias | Description | Default |
| --- | --- | --- | --- |
| `--format <fmt>` | `-f` | Output format: console, text, json, html, markdown, llm | console |
| `--max-pages <n>` | `-m` | Maximum number of pages to crawl (max 500) | 500 |
| `--refresh` | `-r` | Ignore cache and fully recrawl | false |
| `--resume` | - | Resume an interrupted crawl | false |
| `--verbose` | `-v` | Detailed output | false |
| `--debug` | - | Debug logs | false |

report

| Option | Alias | Description |
| --- | --- | --- |
| `--format <fmt>` | `-f` | console, text, json, html, markdown, xml, llm |

Output Formats

  • console (default): Colorful, readable output with progress.
  • llm: A compact XML/text hybrid designed for LLMs, using roughly 40% fewer tokens than verbose XML, and including:
    • Summary: Health score and core metrics
    • Issues categorized by rules (Core SEO, Technical, Content, Security, etc.)
    • Broken link list (internal + external)
    • Improvement suggestions sorted by priority

Examples

1. Quick Site Audit (LLM Output)

```bash
squirrel audit https://squirrelscan.com --format llm
```

2. Deep Audit for Large Sites

```bash
squirrel audit https://myblog.com --max-pages 500 --format llm
```

3. Re-audit After Website Revamp (Ignore Cache)

```bash
squirrel audit https://example.com --refresh --format llm
```

4. Two-Step Workflow (Reuse Existing Audit)

```bash
squirrel audit https://example.com
# Note the output audit-id, e.g., a1b2c3d4

squirrel report a1b2c3d4 --format llm
```
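For before/after comparisons (example 3), the json format can be diffed programmatically. A sketch, with a loud caveat: the exact JSON schema is not documented here, so the `health_score` field name below is a hypothetical placeholder, and the sample files stand in for real `squirrel report <audit-id> --format json` output:

```shell
#!/usr/bin/env bash
# Extract a numeric "health_score" field (hypothetical field name) from
# two JSON report files and print the delta between them.
score_of() {
  sed -n 's/.*"health_score"[[:space:]]*:[[:space:]]*\([0-9][0-9]*\).*/\1/p' "$1" | head -n1
}

# Stand-ins for exported before/after reports:
printf '{"health_score": 72}\n' > before.json
printf '{"health_score": 88}\n' > after.json

before=$(score_of before.json)
after=$(score_of after.json)
echo "health score: $before -> $after (delta $((after - before)))"
# prints: health score: 72 -> 88 (delta 16)
```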

Output

After completing the audit and fixes, provide the user with a summary of all changes.

Troubleshooting

`squirrel` command not found

  1. Install: `curl -fsSL https://squirrelscan.com/install | bash`
  2. Add to PATH: `export PATH="$HOME/.local/bin:$PATH"`
  3. Verify: `squirrel --version`

Permission Errors

```bash
chmod +x ~/.local/bin/squirrel
```

Crawl Timeouts or Slow Speed

Large sites may take longer; use `--verbose` to check progress:
```bash
squirrel audit https://example.com --format llm --verbose
```

Invalid URL

Always include the protocol (`http://` or `https://`):
```bash
# ✗ Incorrect
squirrel audit example.com

# ✓ Correct
squirrel audit https://example.com
```
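A small guard can normalize bare hostnames before they ever reach the CLI. A sketch; defaulting to `https://` is an assumption that suits most live sites:

```shell
#!/usr/bin/env bash
# Prepend https:// when the URL has no scheme, leave full URLs untouched,
# then echo the audit command (dry run; swap `echo` for real execution).
normalize_url() {
  case "$1" in
    http://*|https://*) echo "$1" ;;
    *) echo "https://$1" ;;
  esac
}

echo "squirrel audit $(normalize_url example.com)"
# prints: squirrel audit https://example.com
```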

Workflow

  1. Crawl: Discover and fetch pages from the base URL
  2. Analyze: Execute audit rules on each page
  3. External Links: Check availability of external links
  4. Report: Generate an LLM-optimized report
Audit results are saved in the local database and can later be exported in other formats with `squirrel report`.

Additional Resources