# How to Add llms.txt to Your Website (Step-by-Step Guide)

Tags: llms.txt · AI agents · tutorial

March 18, 2026



    
      AI agents are crawling the web differently than Google bots do. While search engines
      index everything and sort it out later, AI assistants like ChatGPT, Claude, and
      Perplexity are actively looking for structured signals that tell them what your site is
      about and what content they are allowed to use.
    


  

  
    
      Enter `llms.txt` —
      a plain-text file that lives at the root of your website, purpose-built for large
      language models. Think of it as 
      `robots.txt`, but instead
      of telling bots where not to go, it tells AI agents exactly what your content is,
      where the good stuff lives, and how it should be understood.
    


    
      This guide walks you through everything: what llms.txt is, why it matters in 2026,
      and the exact steps to add one to your site today.
    



## What Is llms.txt?


    
      The `llms.txt` standard
      was proposed in late 2024 and has gained rapid adoption as AI-powered search and agents
      have gone mainstream. The idea is simple: as LLMs browse the web on behalf of users,
      they need a reliable, low-noise signal about your site's content and purpose.
    


    
      A web page full of navigation menus, cookie banners, sidebar ads, and footer links is
      noisy for an AI to parse. 
      `llms.txt` cuts through
      that — giving a clean, structured summary of what your site offers and linking to
      the most important content in markdown format.
    


    
      It lives at 
      `https://yourdomain.com/llms.txt` 
      and follows a straightforward markdown-based format.
    



## Why Does It Matter?


    
      Here is what has changed: AI agents are becoming the first layer of discovery for
      millions of users. When someone asks ChatGPT “what's the best tool for
      X?” or uses Perplexity to research a purchase, those systems are crawling the web
      in real time. If your site does not have clear signals for AI agents, you risk being
      misunderstood, misrepresented, or skipped entirely.
    


    Beyond search, think about:


    
      - **AI coding assistants** that pull in documentation from your site

      - **Autonomous agents** performing research tasks on behalf of users

      - **RAG pipelines** that index your content to answer customer questions

      - **AI browser agents** that need to understand your site's structure before interacting with it

    

    
      Sites that speak the language of AI agents in 2026 will have a compounding advantage
      — showing up more accurately, more often, in more AI-powered contexts.
    



## The llms.txt Format Explained


    
      Before writing your file, let's understand the structure. An 
      `llms.txt` file is
      written in Markdown and follows this anatomy:
    


    

```markdown
# Your Site or Company Name

> A one or two sentence description of what your site/product does. This is the most important part — it's what the AI reads first.

Optional additional context paragraphs go here.

## Section Name

- [Page Title](https://yourdomain.com/page): One-line description of what this page covers.

## Another Section

- [Another Page](https://yourdomain.com/another): One-line description of what this page covers.

## Optional

- [Terms of Service](https://yourdomain.com/terms): Optional.
- [Privacy Policy](https://yourdomain.com/privacy): Optional.
```
      Key rules:
    
    
      
        - **Line 1:** An H1 with your site/company name
    
        - **The blockquote (>):** A concise description — this is the highest-signal section
    
        - **Sections (H2):** Logical groupings of your content
    
        - **List items:** Markdown links with a brief colon-separated description
    
        - **“Optional” section:** Anything that is useful but lower priority for AI consumption
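These rules are mechanical enough to check in a few lines. As an illustration, here is a minimal sanity check for the two highest-signal rules; `validateLlmsTxt` is a hypothetical helper sketched for this post, not part of the spec:

```javascript
// Minimal llms.txt sanity check (illustrative, not part of the spec):
// rule 1: the first line is an H1; rule 2: a blockquote description exists.
function validateLlmsTxt(text) {
  const lines = text.split('\n');
  const errors = [];
  if (!lines[0] || !lines[0].startsWith('# ')) {
    errors.push('First line must be an H1 ("# Site Name")');
  }
  if (!lines.some((line) => line.startsWith('> '))) {
    errors.push('Missing blockquote ("> ") description');
  }
  return errors; // empty array means the basics are in place
}

// Example:
validateLlmsTxt('# Acme\n\n> Dev tools for realtime apps.\n'); // → []
```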
    
      
    
    
## Step-by-Step: Adding llms.txt to Your Website
    
    
### Step 1: Plan Your Content Map
    
      
        Before writing a single line, open a blank document and answer these questions:
      
    
    
      
        - What does my site/product actually do in one sentence?
    
        - What are the 5–10 most important pages an AI should know about?
    
        - How should those pages be grouped?
    
      
    
      
        For a developer tool, your sections might be: Docs, API Reference, Guides, Blog. For an
        e-commerce site: Products, Categories, About. For a SaaS: Features, Pricing,
        Documentation, Support.
      
    
    
    
### Step 2: Write Your llms.txt File
    
      Here is a real-world example for a developer documentation site:
    
    
      
    

```markdown
# Acme Dev Tools

> Acme Dev Tools provides a REST API and SDKs for adding real-time collaboration features to web applications. Trusted by 12,000+ developers.

Our tools handle presence indicators, live cursors, and conflict-free document editing (CRDT-based). We have SDKs for JavaScript, Python, Go, and Rust.

## Getting Started

- [Quick Start Guide](https://acme.example/docs/quickstart): Install the SDK and build your first real-time feature in under 10 minutes.
- [Core Concepts](https://acme.example/docs/concepts): Understand rooms, presence, and document sync.
- [Authentication](https://acme.example/docs/auth): How to authenticate users and issue JWT tokens.

## API Reference

- [REST API](https://acme.example/api/rest): Full REST API reference with request/response examples.
- [JavaScript SDK](https://acme.example/api/js): Complete JavaScript/TypeScript SDK reference.
- [Webhooks](https://acme.example/docs/webhooks): Configure webhooks to react to room events.

## Guides

## Pricing & Plans

- [Pricing](https://acme.example/pricing): Free tier, Pro ($49/mo), and Enterprise plans.

## Optional

- [Status Page](https://status.acme.example): Real-time API uptime and incident history.
- [Changelog](https://acme.example/changelog): What's new in each release.
- [GitHub](https://github.com/acme/sdk): Open source SDK repository.
```

### Step 3: Create the File
    
      
        Save your content as a plain text file named 
        `llms.txt`. No HTML, no
        special encoding — just UTF-8 plain text written in Markdown format.
      
    
    
    
### Step 4: Serve It at the Right URL
    
      
        The file must be accessible at 
        `https://yourdomain.com/llms.txt`.
        How you do this depends on your stack:
      
    
    
    
#### Static Sites (Netlify, Vercel, GitHub Pages)
    
      
        Simply place `llms.txt` in
        your `public/` or root
        directory. It will be served automatically.
      
    
    
    
#### Next.js
    
    
      
        Place it in your 
        `/public` directory —
        Next.js serves everything in 
        `/public` as static files.
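If you would rather generate the file dynamically, an App Router route handler can also serve it. A minimal sketch follows; the `app/llms.txt/route.js` path follows Next.js routing conventions, and the inline content is a placeholder (in practice you might read it from a file):

```javascript
// Sketch of app/llms.txt/route.js in a Next.js App Router project.
// Next.js invokes the exported GET for requests to /llms.txt.
const body = [
  '# Acme Dev Tools',
  '',
  '> REST API and SDKs for real-time collaboration features.',
  '',
].join('\n');

// In route.js this function would be written as `export function GET() { ... }`.
function GET() {
  return new Response(body, {
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
  });
}
```

This keeps the content in your codebase and sets the `text/plain` content type explicitly.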
      
    
    
    
#### Express.js

```javascript
const express = require('express');
const path = require('path');
const app = express();

// Option 1: serve everything in /public as static files (llms.txt included)
app.use(express.static(path.join(__dirname, 'public')));

// Option 2: serve it explicitly, forcing the text/plain content type
app.get('/llms.txt', (req, res) => {
  res.type('text/plain');
  res.sendFile(path.join(__dirname, 'llms.txt'));
});

app.listen(3000);
```

#### WordPress


    
      Upload `llms.txt` via FTP
      or your hosting file manager to the root of your WordPress installation (same directory
      as `wp-config.php`).
    



#### nginx

```nginx
server {
    # ... your existing server config ...

    location = /llms.txt {
        root /var/www/yoursite;   # adjust to your document root
        default_type text/plain;
    }
}
```

### Step 5: Verify It Is Live


    
      Open your browser and navigate to 
      `https://yourdomain.com/llms.txt`.
      You should see your plain text file. If you see a 404, double-check your file placement
      and server config.
    


    You can also run a quick curl check:


    

```bash
curl -I https://yourdomain.com/llms.txt

# Should return:
# HTTP/2 200
# content-type: text/plain
```
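The same checks can be scripted for CI. Here is a minimal Node sketch (Node 18+ for the global `fetch`); `auditLlmsResponse` is a hypothetical helper split out so the logic is easy to test, and the URL is a placeholder:

```javascript
// check-llms.js — verifies the basics of a served llms.txt (sketch)
function auditLlmsResponse(status, contentType, body) {
  return {
    ok: status === 200,                                      // file is live
    plainText: (contentType || '').includes('text/plain'),   // right content type
    hasH1: body.startsWith('# '),                            // starts with an H1
  };
}

// Fetch a live file and audit it:
async function checkLlmsTxt(url) {
  const res = await fetch(url);
  return auditLlmsResponse(
    res.status,
    res.headers.get('content-type'),
    await res.text()
  );
}

// Usage: checkLlmsTxt('https://yourdomain.com/llms.txt').then(console.log);
```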

### Step 6: Add the llms-full.txt Variant (Optional but Recommended)


    
      The `llms.txt` standard
      also supports a companion file: 
      `llms-full.txt`. While 
      `llms.txt` is a
      directory/index (links to content), 
      `llms-full.txt` contains
      the actual full content of your most important pages, pre-processed into clean markdown.
    


    
      This is especially powerful for documentation sites, where you want AI agents to have
      the complete content without having to crawl each page individually.
    



## Common Mistakes to Avoid


    
      - 
        **Listing every single page.** Keep it to your 10–20
        most important pages. AI agents do not need your tag pages, author archives, or
        paginated lists.
      

      - 
        **Vague descriptions.** “About page” is useless.
        “Company background, team, and mission statement for Acme Dev Tools” is
        useful.
      

      - 
        **Forgetting to update it.** Set a reminder to review your 
        `llms.txt` quarterly.
        When you add major features or new docs sections, update the file.
      

      - 
        **Wrong content type.** Serve it as 
        `text/plain`, not 
        `text/html`. Some AI
        crawlers are picky about this.
      

      - 
        **Skipping the blockquote description.** The 
        `>` description is
        the highest-weight signal in the file. Do not leave it out or make it generic.
      

    


## What Comes After llms.txt?


    
      `llms.txt` is a great
      start, but it is one piece of a larger AI-readiness puzzle. To be fully optimized for
      the way AI agents interact with websites in 2026, you also need:
    


    
      - **Structured data / JSON-LD** so AI agents understand your content type and entities

      - **Clean semantic HTML** that parses well without a browser

      - **Descriptive alt text** on images (AI vision agents read these)

      - **Machine-readable pricing and feature data**

      - **A well-formed robots.txt** that does not accidentally block AI crawlers
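As a taste of the structured-data piece, here is a sketch that builds a JSON-LD block for a software product. Every value is a placeholder; adapt the schema.org type and fields to your actual content:

```javascript
// Sketch: build a JSON-LD block describing a product (all values are placeholders).
// Embed the output in your page <head> so agents can read it without running JS.
const jsonLd = {
  '@context': 'https://schema.org',
  '@type': 'SoftwareApplication',
  name: 'Acme Dev Tools',
  applicationCategory: 'DeveloperApplication',
  description: 'REST API and SDKs for real-time collaboration features.',
  url: 'https://acme.example',
  offers: { '@type': 'Offer', price: '49', priceCurrency: 'USD' },
};

const tag = `<script type="application/ld+json">${JSON.stringify(jsonLd)}</script>`;
console.log(tag);
```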

    

  

  
    
      Ready to see how AI-ready your site actually is?
    


    
      AgentReady runs a full AI-readiness audit on your website in seconds. It checks for
      llms.txt, structured data, semantic HTML quality, robots.txt configuration, and a dozen
      other signals that AI agents look for when they crawl your site.
    


[Scan Your Website Free](/)
  

  
[← Back to Blog](/blog)