Is this a good starting place, or should I optimize the other pages first?
=====
Where it appears in the book
Section “C. Indexability” → “Boosting AI Crawler Access With An llms.txt File”. The book introduces llms.txt as a new proposal that helps AI crawlers better understand your site and improves how your content is recognized and indexed.
What the book says it is (and why)
A simple text file at your site root (https://yoursite.com/llms.txt) that acts like a “backstage pass” or roadmap for AI bots—giving them concise metadata and curated links so they grasp your site’s purpose and key resources. It’s presented as an easy, high-impact step to help AI systems “see and understand” your content and prioritize it in AI-driven results.
How the book says to create it
Format it in Markdown, broken into clear sections so crawlers know what they’re looking at:
H1 Title (site or project name)
Summary block (1–2 line site description)
Documentation links (e.g., Getting Started, API, Tutorials)
Optional resources (community forum, changelog, etc.)
The book even shows a mini layout (title, brief summary, docs list, optional resources).
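Following that layout, a minimal llms.txt might look like the sketch below. The project name, summary wording, and URLs are placeholders, not examples taken from the book; the blockquote summary (`> `) follows the common llms.txt convention:

```markdown
# Example Project

> One-line description of what the site or project does.

## Documentation

- [Getting Started](https://yoursite.com/docs/start)
- [API Reference](https://yoursite.com/docs/api)
- [Tutorials](https://yoursite.com/docs/tutorials)

## Optional

- [Community Forum](https://yoursite.com/forum)
- [Changelog](https://yoursite.com/changelog)
```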
Save and place it at the site root as llms.txt; optionally maintain a longer companion file, llms-full.txt.
Test accessibility by visiting the URL in a browser to confirm that it loads and that the text is readable.
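Beyond eyeballing the file in a browser, you can sanity-check its structure with a short script. This is a minimal sketch, not part of the book's instructions: the section checks mirror the layout above, and the exact heading and summary conventions it enforces are assumptions rather than a formal spec.

```python
import re
import urllib.request

def validate_llms_txt(text: str) -> list[str]:
    """Return a list of problems found in an llms.txt document (empty = OK)."""
    problems = []
    lines = [ln.strip() for ln in text.splitlines() if ln.strip()]
    # H1 title on the first non-blank line
    if not lines or not lines[0].startswith("# "):
        problems.append("missing H1 title on the first line")
    # Short blockquote summary somewhere in the file
    if not any(ln.startswith("> ") for ln in lines):
        problems.append("missing blockquote summary (lines starting with '> ')")
    # At least one Markdown link for crawlers to follow
    if not re.search(r"\[[^\]]+\]\([^)]+\)", text):
        problems.append("no Markdown links found")
    return problems

def fetch_and_validate(url: str) -> list[str]:
    # Fetch the live file; 'yoursite.com' is a placeholder domain.
    with urllib.request.urlopen(url) as resp:
        return validate_llms_txt(resp.read().decode("utf-8"))

sample = """# Example Project
> One-line description of what the project does.

## Documentation
- [Getting Started](https://yoursite.com/docs/start)
"""
print(validate_llms_txt(sample))  # → []
```

Running `fetch_and_validate("https://yoursite.com/llms.txt")` after deployment confirms the file is both reachable and shaped the way the book describes.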
The book’s “why this matters”
Adding llms.txt “gives AI crawlers a clearer map” of your content and site structure, helping them prioritize your pages in AI-driven search surfaces.