Let your AI assistant write your Gaffa code
Discover how to use Gaffa's llms.txt file to give your AI assistant up-to-date knowledge of the Gaffa API, so you can build faster with less back-and-forth.
Mar 18, 2026

AI assistants like ChatGPT or Claude can generate working code far more effectively when they have accurate, up-to-date context about an API. That's exactly what Gaffa's llms.txt file provides: a concise reference covering Gaffa's endpoints, actions, and code samples that you can drop directly into any AI assistant to get useful, accurate code from the very first prompt.
In this tutorial, we'll walk you through how to use the llms.txt file to build a complete Python script that interacts with the Gaffa API.
Step 1: Get the llms.txt File
Download or open the file at https://gaffa.dev/docs/llms-full.txt. Unlike a full documentation site, the file is structured specifically for AI assistants. It's compact enough to paste directly into a chat window and formatted so that a language model can quickly understand the API's shape, what each endpoint expects, and what it returns.
Step 2: Load It Into Your AI Assistant
Start a new chat with ChatGPT, Claude, or your preferred AI assistant, then paste the full contents of the file into the conversation. This gives the assistant accurate, up-to-date context about the Gaffa API before you ask it anything.
Note: If the file is too long to paste in one go, paste it in chunks. Label each chunk as "Part 1 of 3", "Part 2 of 3", and so on, and ask the assistant to acknowledge each one before you send the next. Once all parts are sent, let it know you're done and ask it to confirm it has the full context. After that, you're ready to start building.
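If you'd rather not split the file by hand, a small helper can do the labeling for you. This is just a sketch: the file name and chunk size below are placeholders, and you should tune `max_chars` to whatever your assistant comfortably accepts per message.

```python
# Split a large text file into labeled chunks for pasting into a chat window.
# The file name and chunk size are placeholders; adjust max_chars as needed.
import os

def chunk_text(text: str, max_chars: int = 30_000) -> list[str]:
    """Split text into pieces of at most max_chars, each labeled 'Part i of n'."""
    pieces = [text[i:i + max_chars] for i in range(0, len(text), max_chars)]
    total = len(pieces)
    return [f"Part {i} of {total}:\n{piece}" for i, piece in enumerate(pieces, start=1)]

if __name__ == "__main__" and os.path.exists("llms-full.txt"):
    with open("llms-full.txt", encoding="utf-8") as f:
        for chunk in chunk_text(f.read()):
            print(chunk.splitlines()[0])  # e.g. "Part 1 of 3:" -- paste each in order
```

Each returned chunk already carries its "Part i of n" label, so you can paste them straight into the chat in order.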
Step 3: Ask the Assistant to Write Your Script
Once the assistant has the context loaded, you can ask it to build scripts for you. Because it already has the full API context, it can produce accurate code without you needing to explain endpoint structures or payload formats. Here's an example of what such a conversation might look like:
You:
"Write me a Python script that uses Gaffa's browser API to convert a page into Markdown and save the output file locally."
Assistant:
"Here's a Python script that submits a browser request to Gaffa, polls for completion, and saves the Markdown output to a local file..."
The reply included a complete gaffa_to_markdown script, followed by usage instructions.
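To make the flow concrete, here is a sketch of the kind of script such a reply might contain. Everything API-specific below is an illustrative assumption rather than Gaffa's documented interface: the base URL, the `/browser/requests` endpoint, the header names, and the response fields (`id`, `status`, `result.markdown`) are guesses, which is exactly why the assistant works from the llms.txt reference instead.

```python
# Sketch only: every endpoint path, header, and response field below is an
# assumed placeholder, not Gaffa's documented API. An assistant loaded with
# the llms.txt context would fill in the real shapes.
import json
import time
import urllib.request
from typing import Optional

API_BASE = "https://api.gaffa.dev/v1"  # assumed base URL
API_KEY = "YOUR_GAFFA_API_KEY"         # replace with your real key

def build_payload(url: str) -> dict:
    """Request body asking Gaffa's browser to render a page as Markdown."""
    return {"url": url, "actions": [{"type": "generate_markdown"}]}

def _request(path: str, body: Optional[dict] = None) -> dict:
    """POST body as JSON if given, otherwise GET; decode the JSON reply."""
    req = urllib.request.Request(
        API_BASE + path,
        data=json.dumps(body).encode() if body is not None else None,
        headers={"Authorization": f"Bearer {API_KEY}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def page_to_markdown(url: str, out_file: str = "page.md") -> str:
    """Submit a browser request, poll until it finishes, save the Markdown."""
    job = _request("/browser/requests", build_payload(url))   # assumed endpoint
    while job.get("status") not in ("completed", "failed"):   # assumed statuses
        time.sleep(2)
        job = _request(f"/browser/requests/{job['id']}")      # assumed poll URL
    if job["status"] == "failed":
        raise RuntimeError(f"Gaffa request failed: {job}")
    markdown = job["result"]["markdown"]                      # assumed field
    with open(out_file, "w", encoding="utf-8") as f:
        f.write(markdown)
    return out_file
```

Even where the details differ, the overall shape the assistant described holds: submit a browser request, poll for completion, and save the Markdown output locally.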
As you can see, the assistant didn't just produce working code; it also gave step-by-step instructions on how to run it.
Step 4: Extend and Customise
From here, you can modify the actions list to use other supported operations such as print, capturescreenshot, or generate_markdown. You can make these changes manually, or simply ask your AI assistant to adapt the script for you. Since it still has the llms.txt context loaded, it can adjust the code to your specific requirements without needing any further explanation.
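As a sketch of what that change looks like, swapping the action is essentially a one-line edit to the request body. The payload structure here is an assumption; the action names are the ones listed above, and both should be confirmed against the llms.txt reference.

```python
# Build request bodies for different Gaffa actions. The payload shape is an
# assumption; confirm it, and the exact action names, against the llms.txt file.
SUPPORTED_ACTIONS = {"print", "capturescreenshot", "generate_markdown"}

def build_request_body(url: str, action: str) -> dict:
    """Return a browser-request body for one of the supported actions."""
    if action not in SUPPORTED_ACTIONS:
        raise ValueError(f"Unsupported action: {action!r}")
    return {"url": url, "actions": [{"type": action}]}

# Example: ask for a screenshot instead of Markdown.
screenshot_body = build_request_body("https://example.com", "capturescreenshot")
```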
We're always looking for ways to help developers spend less time wrestling with integration details and more time building what matters. The llms.txt file is one small step in that direction, and we'll keep finding new ways to make working with Gaffa as frictionless as possible. If you have ideas or feedback, we'd love to hear from you.
Ready to build your own integration?
Join hundreds of teams using Gaffa to automate their browser workflows
