
Use Granit docs with AI assistants

Stop copy-pasting code snippets into your AI assistant. Granit exposes its entire documentation as plain-text files that any LLM can ingest in one shot — so it answers with real framework knowledge, not guesswork.

### ChatGPT

  1. Open chatgpt.com and start a new conversation.
  2. Click the attachment icon and upload llms-full.txt (download it from https://granit-fx.dev/llms-full.txt first).
  3. Ask your question — ChatGPT now has the full Granit documentation as context.

For repeated use, create a Custom GPT and add https://granit-fx.dev/llms-full.txt as a knowledge file. Every conversation will start with Granit context built in.

### Claude

  1. Open claude.ai and start a new conversation.
  2. Attach llms-full.txt as a file.
  3. Ask your question.

For repeated use, create a Claude Project and add the file as project knowledge — it stays available across all conversations in that project.

### Claude Code

Add this to your project’s CLAUDE.md so Claude Code loads Granit context automatically:

```markdown
## Granit documentation
Full framework reference:
https://granit-fx.dev/llms-full.txt
```

Every conversation in the project will have access to the full module reference, patterns, and ADRs.
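If you prefer to script it, the section above can be appended from the command line or a short script. A minimal Python sketch (it assumes you run it from the project root; the snippet body matches the one shown above):

```python
from pathlib import Path

SNIPPET = """
## Granit documentation
Full framework reference:
https://granit-fx.dev/llms-full.txt
"""

# Append the Granit section to CLAUDE.md, creating the file if it
# does not exist yet ("a" mode handles both cases).
claude_md = Path("CLAUDE.md")
with claude_md.open("a", encoding="utf-8") as f:
    f.write(SNIPPET)
```

Appending rather than overwriting keeps any project notes already in CLAUDE.md intact.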

### GitHub Copilot

Add a .github/copilot-instructions.md file to your repository so Copilot always knows about Granit:

```markdown
When answering questions about the Granit framework,
refer to the full documentation at:
https://granit-fx.dev/llms-full.txt
```
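Scripted, creating that file looks like this (a sketch, assuming you run it from the repository root; the file body matches the instructions above):

```python
from pathlib import Path

INSTRUCTIONS = """\
When answering questions about the Granit framework,
refer to the full documentation at:
https://granit-fx.dev/llms-full.txt
"""

# Create .github/ if it is missing, then write the instructions file.
target = Path(".github/copilot-instructions.md")
target.parent.mkdir(parents=True, exist_ok=True)
target.write_text(INSTRUCTIONS, encoding="utf-8")
```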

Then ask Copilot Chat as usual:

```
@workspace How does Granit handle multi-tenancy?
```

### Other tools

Any LLM tool that supports the llms.txt standard will discover the documentation automatically from https://granit-fx.dev/llms.txt.
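The discovery rule itself is simple: the file lives at the site root, whatever docs page you start from. A sketch of that convention (the helper name is illustrative, not part of any tool):

```python
from urllib.parse import urlsplit

def llms_txt_url(page_url: str, variant: str = "llms.txt") -> str:
    # The llms.txt convention places the file at the site root,
    # regardless of which documentation page you started from.
    parts = urlsplit(page_url)
    return f"{parts.scheme}://{parts.netloc}/{variant}"

llms_txt_url("https://granit-fx.dev/docs/modules/tenancy")
# -> "https://granit-fx.dev/llms.txt"
```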

For tools with smaller context windows, use the compact version: https://granit-fx.dev/llms-small.txt.
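If you are wiring this into your own tooling, a rough way to decide between the two files is to estimate the token count against the model's context window. A sketch using the common ~4-characters-per-token heuristic for English prose (the function name and reserve value are illustrative):

```python
def pick_variant(doc_text: str, context_window: int, reserve: int = 2000) -> str:
    # Rough heuristic: ~4 characters per token for English prose.
    # Keep `reserve` tokens free for the question and the answer.
    est_tokens = len(doc_text) // 4
    if est_tokens + reserve <= context_window:
        return "llms-full.txt"
    return "llms-small.txt"
```

For example, a 40,000-character document (roughly 10,000 tokens) would not fit an 8,192-token window once the reserve is counted, so the compact file is chosen.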