audit-website
Use the squirrelscan CLI (squirrel) to audit websites, covering over 140 rules in SEO, technical aspects, content, performance, security, etc. This skill applies when you need to analyze website health, troubleshoot technical SEO issues, check for broken links, verify meta tags and structured data, generate site audit reports, compare before and after website revamps, or when terms like 'website audit', 'audit website', 'squirrel', 'site health check' are mentioned.
Website Audit Skill
Use the CLI tool from squirrelscan to audit websites for SEO, technical aspects, content, performance, and security.
squirrel supports macOS, Windows, and Linux. It simulates browsers and search crawlers, analyzes site structure and content against 140+ rules, and outputs a list of issues along with repair suggestions.
Links
- Official Website: https://squirrelscan.com
- Documentation (including rule explanations): https://docs.squirrelscan.com
Rule documentation URL template:
https://docs.squirrelscan.com/rules/{rule_category}/{rule_id}
Example: https://docs.squirrelscan.com/rules/links/external-links
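As a quick sanity check, the template can be expanded in shell; the category and rule id values below come from the example above:

```shell
# Expand the rule-doc URL template for one rule (values from the example above)
rule_category="links"
rule_id="external-links"
url="https://docs.squirrelscan.com/rules/${rule_category}/${rule_id}"
echo "$url"   # → https://docs.squirrelscan.com/rules/links/external-links
```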
What This Skill Can Do
Enables AI agents to audit websites across 20+ categories and 140+ rules, including:
- SEO: Meta tags, title, description, canonical, Open Graph
- Technical: Broken links, redirect chains, page speed, mobile-friendliness
- Performance: Load time, resource usage, caching
- Content: Heading structure, image alt text, content analysis
- Security: Exposed secrets, HTTPS, security headers, mixed content
- Accessibility: Alt text, color contrast, keyboard navigation
- Usability: Form validation, error handling, user flows
- Links: Broken internal and external link detection
- E-E-A-T: Experience, expertise, authority, trustworthiness
- Mobile: Mobile-friendliness, responsiveness, touch elements
- Crawlability: robots.txt, sitemap.xml, etc.
- Schema: Schema.org, structured data, rich snippets
- Legal: Compliance with privacy policies, terms of service, etc.
- Social: Open Graph, Twitter Cards, and schema validation
- URL Structure: Length, hyphens, keywords
- Keywords: Keyword stuffing detection
- Images: Alt text, contrast, size, format
- Local SEO: NAP consistency, geographic metadata
- Videos: VideoObject schema, accessibility
The audit will crawl the site, analyze pages against rules, and generate a report including:
- Overall health score (0–100)
- Breakdown by category (Core SEO, Technical SEO, Content, Security, etc.)
- Specific issues and affected URLs
- Broken link list
- Actionable improvement suggestions
When to Use
Use this skill in the following scenarios:
- Analyze website health
- Troubleshoot technical SEO issues
- Fix the various issues mentioned above
- Check for broken links
- Verify meta tags and structured data
- Generate site audit reports
- Compare health scores before and after website revamps
- Improve performance, accessibility, SEO, security, etc.
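For the before/after comparison in particular, a minimal sketch (assuming squirrel is on PATH; example.com and the report file names are placeholders, and only flags documented in this skill are used):

```shell
#!/bin/sh
# Hedged sketch: baseline audit, then a cache-ignoring re-audit after the revamp.
if command -v squirrel >/dev/null 2>&1; then
  squirrel audit https://example.com --format llm > before-report.txt
  # ...deploy the revamp, then force a full recrawl with --refresh...
  squirrel audit https://example.com --refresh --format llm > after-report.txt
  # Inspect health-score and issue-count deltas between the two reports
  diff before-report.txt after-report.txt || true
  status="compared"
else
  status="squirrel not installed; see Prerequisites"
fi
echo "$status"
```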
Prerequisites
This skill depends on the squirrel CLI, which must be installed and added to your PATH.
Installation (macOS / Linux)
```bash
curl -fsSL https://squirrelscan.com/install | bash
```

This will:
- Download the latest binary
- Install to ~/.local/share/squirrel/releases/{version}/
- Create a symlink at ~/.local/bin/squirrel
- Initialize configuration at ~/.squirrel/settings.json
If ~/.local/bin is not in your PATH, add this to your shell configuration:

```bash
export PATH="$HOME/.local/bin:$PATH"
```

Windows Installation
PowerShell:
```powershell
irm https://squirrelscan.com/install.ps1 | iex
```

This will download and install to %LOCALAPPDATA%\squirrel\ and add it to your PATH. If using CMD, you may need to restart the terminal for the PATH changes to take effect.

Verify Installation
```bash
squirrel --version
```

Configuration
Running squirrel init in your project directory will generate squirrel.toml.

Each project should have a unique project name (defaults to the audited site name) to distinguish multiple audits in the database:

```bash
squirrel init --project-name my-project
```

Or:

```bash
squirrel config set project.name my-project
```

If there is no squirrel.toml in the current directory, you must first run squirrel init and specify the project name with -n (it can also be inferred).

The project name is used for database identification and stored in ~/.squirrel/projects/.

Usage
Overview
There are three subcommands, and all results are written to the local project database:
- crawl: Execute or resume crawling
- analyze: Analyze crawled results
- report: Output reports in specified formats (llm, text, console, html, etc.)
audit

```bash
squirrel audit https://example.com --format llm
```

Prefer --format llm: a compact, complete output format designed for LLMs.

Selecting Audit Targets
- If no URL is provided by the user: Infer possible sites from the current directory, environment variables (e.g., Vercel projects), memory, or references in code.
- If a local dev server can be started in the current directory: Audit the local site.
- If multiple auditable sites are found: Ask the user to select one.
- If no sites can be inferred: Ask the user for the URL to audit.
Prioritize auditing live sites, as they better reflect real performance and rendering issues. If both a local and a live site are available, ask the user to choose, recommending the live site. After identifying issues in the live-site audit, fix them in the local code.
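The selection heuristics above might be sketched as follows; VERCEL_URL and the package.json "homepage" field are illustrative sources, not anything squirrel itself reads:

```shell
#!/bin/sh
# Hedged sketch of the target-selection heuristics (sources are illustrative)
infer_target() {
  if [ -n "$VERCEL_URL" ]; then
    echo "https://$VERCEL_URL"                 # a deployed URL from the environment
  elif [ -f package.json ] && grep -q '"homepage"' package.json; then
    sed -n 's/.*"homepage": *"\([^"]*\)".*/\1/p' package.json | head -n1
  else
    echo ""                                    # nothing inferred: ask the user
  fi
}

VERCEL_URL="example.vercel.app"                # simulated environment for the demo
target=$(infer_target)
echo "audit target: $target"
```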
When Implementing Fixes
- Split large-scale fixes into parallel subtasks and use subagents to accelerate the process.
- After fixes are complete, run typecheck/format tools (e.g., ruff, biome, tsc) on the generated code if available in the environment.
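The post-fix check step might look like this; the tool list and flags (ruff check --fix, biome check --write, tsc --noEmit) are common defaults and may need project-specific adjustment:

```shell
#!/bin/sh
# Run whichever typecheck/format tools are present in the environment; skip the rest.
ran=""
if command -v ruff  >/dev/null 2>&1; then ruff check --fix . || true;    ran="$ran ruff";  fi
if command -v biome >/dev/null 2>&1; then biome check --write . || true; ran="$ran biome"; fi
if command -v tsc   >/dev/null 2>&1; then tsc --noEmit || true;          ran="$ran tsc";   fi
echo "tools run:${ran:- none}"
```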
Basic Workflow
- Execute Audit (writes to database and outputs to terminal)
- Export Report in Specified Format
```bash
# 1. Execute audit (default console output)
squirrel audit https://example.com

# 2. Export as LLM format
squirrel report <audit-id> --format llm
```

Common Options
Crawl more pages:

```bash
squirrel audit https://example.com --max-pages 200
```

Ignore cache and force a full recrawl:

```bash
squirrel audit https://example.com --refresh
```

Resume an interrupted crawl:

```bash
squirrel audit https://example.com --resume
```

Detailed output for debugging:

```bash
squirrel audit https://example.com --verbose
```

Command Options
audit

| Option | Alias | Description | Default |
|---|---|---|---|
| --format | | Output format: console, text, json, html, markdown, llm | console |
| --max-pages | | Maximum number of pages to crawl (max 500) | 500 |
| --refresh | | Ignore cache and fully recrawl | false |
| --resume | - | Resume interrupted crawl | false |
| --verbose | | Detailed output | false |
| --debug | - | Debug logs | false |
report

| Option | Alias | Description |
|---|---|---|
| --format | | console, text, json, html, markdown, xml, llm |
Output Formats
- console (default): Colorful, readable output with progress.
- llm: Compact XML/text hybrid designed for LLMs, using about 40% fewer tokens than verbose XML, including:
- Summary: Health score and core metrics
- Issues categorized by rules (Core SEO, Technical, Content, Security, etc.)
- Broken link list (internal + external)
- Improvement suggestions sorted by priority
Examples
1. Quick Site Audit (LLM Output)
```bash
squirrel audit https://squirrelscan.com --format llm
```

2. Deep Audit for Large Sites

```bash
squirrel audit https://myblog.com --max-pages 500 --format llm
```

3. Re-audit After Website Revamp (Ignore Cache)

```bash
squirrel audit https://example.com --refresh --format llm
```

4. Two-Step Workflow (Reuse Existing Audit)

```bash
squirrel audit https://example.com
# Note the output audit-id, e.g., a1b2c3d4
squirrel report a1b2c3d4 --format llm
```

Output
After completing the audit and fixes, provide the user with a summary of all changes.
Troubleshooting
squirrel: command not found

- Install: curl -fsSL https://squirrelscan.com/install | bash
- Add to PATH: export PATH="$HOME/.local/bin:$PATH"
- Verify: squirrel --version
Permission Errors
```bash
chmod +x ~/.local/bin/squirrel
```

Crawl Timeouts or Slow Speed

Large sites may take longer; use --verbose to check progress:

```bash
squirrel audit https://example.com --format llm --verbose
```

Invalid URL
Always include the protocol (http:// or https://):

```bash
# ✗ Incorrect
squirrel audit example.com

# ✓ Correct
squirrel audit https://example.com
```

Workflow
- Crawl: Discover and fetch pages from the base URL
- Analyze: Execute audit rules on each page
- External Links: Check availability of external links
- Report: Generate an LLM-optimized report
Audit results are saved in the local database and can later be exported in different formats using squirrel report.

Additional Resources
- squirrelscan Documentation: https://docs.squirrelscan.com
- CLI Help: squirrel audit --help