Remediation Guides
Step-by-step instructions for fixing every finding that ax-audit reports. Each guide covers one of the 9 weighted checks and explains exactly what to do, with code examples, configuration snippets, and context on why it matters for AI agents.
Robots.txt (15%)
Configure robots.txt for AI crawlers — explicit User-agent entries, Sitemap directives, and avoiding wildcard blocks.
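As a preview of what the guide covers, a minimal robots.txt sketch. GPTBot and ClaudeBot are real AI crawler user agents; the domain is a placeholder:

```text
# Allow major AI crawlers explicitly rather than relying on the wildcard
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

# Fallback rule for all other crawlers
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```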
LLMs.txt (15%)
Create a spec-compliant /llms.txt file with proper headings, blockquotes, sections, links, and an optional /llms-full.txt.
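A minimal /llms.txt sketch following the published spec shape (H1 title, blockquote summary, H2 sections with link lists); the site name and URLs are placeholders:

```markdown
# Example Site

> A one-sentence summary of what the site offers and who it is for.

## Documentation

- [Getting started](https://example.com/docs/start.md): Installation and first steps

## Optional

- [Changelog](https://example.com/changelog.md): Release history
```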
Structured Data (13%)
Add JSON-LD structured data with schema.org context, @graph arrays, key entity types, and BreadcrumbList navigation.
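An illustrative JSON-LD sketch with the elements the check looks for: schema.org context, a `@graph` array, a key entity type, and a `BreadcrumbList`. The organization and URLs are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "@id": "https://example.com/#org",
      "name": "Example Co",
      "url": "https://example.com/"
    },
    {
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://example.com/" },
        { "@type": "ListItem", "position": 2, "name": "Guides", "item": "https://example.com/guides" }
      ]
    }
  ]
}
</script>
```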
HTTP Headers (13%)
Set security headers, AI discovery Link headers, and CORS for .well-known resources.
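A sketch of the kinds of response headers involved. The security headers are standard; the `Link` rel value shown is illustrative, not an established registered relation:

```text
# Security headers
X-Content-Type-Options: nosniff
X-Frame-Options: DENY
Strict-Transport-Security: max-age=31536000

# AI discovery link (rel value is an assumption)
Link: <https://example.com/llms.txt>; rel="llms-txt"

# CORS on /.well-known/* responses so agents can fetch them cross-origin
Access-Control-Allow-Origin: *
```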
Agent Card (A2A) (10%)
Create /.well-known/agent.json with required fields, skills, protocol version, and optional capabilities.
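A minimal Agent Card sketch; the agent name, URL, and the `protocolVersion` value are placeholders to be replaced per the guide:

```json
{
  "name": "Example Agent",
  "description": "Answers questions about example.com content",
  "url": "https://example.com/a2a",
  "version": "1.0.0",
  "protocolVersion": "0.2.5",
  "capabilities": { "streaming": false },
  "skills": [
    {
      "id": "site-search",
      "name": "Site search",
      "description": "Search articles and docs on example.com"
    }
  ]
}
```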
MCP (Model Context Protocol) (10%)
Create /.well-known/mcp.json with server metadata, tools with descriptions, resources, and CORS.
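An illustrative /.well-known/mcp.json sketch covering server metadata, tools with descriptions, and resources. The exact field names here are assumptions, since this well-known file is not yet formally standardized; follow the guide for the shape ax-audit checks:

```json
{
  "name": "example-mcp",
  "version": "1.0.0",
  "description": "MCP server exposing example.com content",
  "tools": [
    {
      "name": "get_article",
      "description": "Fetch a single article by its slug"
    }
  ],
  "resources": [
    { "uri": "https://example.com/docs", "name": "Documentation" }
  ]
}
```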
Security.txt (8%)
Create /.well-known/security.txt per RFC 9116 with Contact, Expires, and optional fields.
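A minimal security.txt sketch with the two fields RFC 9116 requires (`Contact`, `Expires`) plus two common optional ones; the contact address and dates are placeholders:

```text
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:59.000Z
Preferred-Languages: en
Canonical: https://example.com/.well-known/security.txt
```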
Meta Tags (8%)
Add AI meta tags, rel="alternate" links to llms.txt and agent.json, rel="me" identity links, and OpenGraph.
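A sketch of the head elements involved; the MIME types on the alternate links and the identity URL are illustrative choices, not mandated values:

```html
<!-- Discovery links to AI-facing resources -->
<link rel="alternate" type="text/markdown" href="/llms.txt" title="LLMs.txt">
<link rel="alternate" type="application/json" href="/.well-known/agent.json" title="Agent Card">
<!-- Identity -->
<link rel="me" href="https://github.com/example">
<!-- OpenGraph -->
<meta property="og:title" content="Example Page">
<meta property="og:description" content="What this page is about">
```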
OpenAPI Spec (8%)
Create /.well-known/openapi.json with version, info, paths, and servers.
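A minimal OpenAPI 3.1 sketch with the four elements the check names (version, info, paths, servers); the API title, server URL, and path are placeholders:

```json
{
  "openapi": "3.1.0",
  "info": { "title": "Example API", "version": "1.0.0" },
  "servers": [{ "url": "https://example.com/api" }],
  "paths": {
    "/articles": {
      "get": {
        "summary": "List articles",
        "responses": { "200": { "description": "OK" } }
      }
    }
  }
}
```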