How I Work and How I Create AI Context Documentation - 05/03/2026
A key part of my professional workflow is creating context documents for AI. These documents let the tools I use understand what I do and how they should support my work.
One habit that has genuinely transformed the way I work with AI tools is writing a CONTEXT.md file at the start of every project. It takes maybe 30 minutes upfront, but it pays dividends every single day. Instead of re-explaining the project stack, style conventions, or deployment process in every conversation, I point the AI at the file—and the suggestions I get back are immediately useful. In this post I want to walk through what that file looks like, why it works, and how you can adopt the practice in your own workflow.
What Is a CONTEXT.md?
A CONTEXT.md is a plain-text markdown file that lives at the root of a repository. Its sole purpose is to give any AI assistant—or a new human teammate, for that matter—a reliable, up-to-date snapshot of the project. Think of it as a lightweight README focused on how work gets done rather than what the project is.
The file covers four areas:
- Project overview — purpose, current scope, and key constraints.
- Technical decisions — language, framework, architecture patterns, and why they were chosen.
- Code and style conventions — naming, folder structure, linting rules, commit message format.
- Workflow — branching strategy, PR review process, deployment pipeline, secrets management.
None of this is new information you have to invent. It is the knowledge that already lives in your head or scattered across README files, wiki pages, and Slack threads. Writing it down in one place simply makes it accessible.
A Real Example
Below is the CONTEXT.md I use for Keppli Finance, a personal finance app I’m building with Flutter and NestJS:
```markdown
# Keppli Finance — AI Context

## Project Overview
Keppli Finance is a mobile-first personal finance app that helps users
track income, expenses, and savings goals. The target audience is people
in Latin America with limited access to traditional financial advisory
services.

Current focus: MVP release for Android and iOS.

## Tech Stack
- **Mobile**: Flutter 3.x (Dart)
- **Backend**: NestJS + TypeScript
- **Database**: PostgreSQL via Supabase
- **Auth**: Supabase Auth (email + Google OAuth)
- **Storage**: Supabase Storage (receipts, profile images)
- **Analytics**: PostHog + Sentry

## Architecture Decisions
- **Offline-first**: All writes go to SharedPreferences first; a background
  service syncs to Supabase when connectivity is restored.
- **Repository pattern**: Data access is abstracted behind repository
  classes so the UI layer never imports Supabase directly.
- **Feature folders**: Code is organized by feature, not by type
  (`features/transactions/`, `features/goals/`, etc.).

## Code Conventions
- Dart: `snake_case` for files, `UpperCamelCase` for classes,
  `lowerCamelCase` for variables.
- TypeScript: ESLint + Prettier, single quotes, no semicolons.
- Commit messages: Conventional Commits (`feat:`, `fix:`, `chore:`, etc.).
- PR titles must follow the same Conventional Commits format.

## Workflow
- Main branch: `main` (protected, requires 1 review + passing CI).
- Feature branches: `feat/<short-description>`.
- Releases are tagged as `v<major>.<minor>.<patch>`.
- CI: GitHub Actions runs lint, tests, and build on every PR.
- Deployments: Vercel for the marketing site; mobile builds through
  Codemagic.
```
When I paste this into a chat or attach it to a Copilot workspace, the AI immediately knows the stack, the patterns in use, and the style I expect. No more suggestions to use Prisma in a project that uses Supabase, or to write code that ignores the offline-first constraint.
How I Keep It Up to Date
The biggest risk with any documentation is that it drifts away from reality. My rule is simple: update CONTEXT.md in the same commit that introduces a significant change. If I swap a library, change the branching strategy, or add a new service, the CONTEXT.md patch is part of that same PR. This keeps the file trustworthy.
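If you want to enforce that rule rather than rely on discipline, a small CI check can flag PRs that touch "architecturally significant" files without touching CONTEXT.md. This is a sketch under my own assumptions: the watched file names are illustrative, and you would feed the function the output of `git diff --name-only origin/main...HEAD`.

```typescript
// Files whose changes usually imply the context doc needs an update.
// Purely illustrative list — adjust to your own repo.
const WATCHED_FILES = ["package.json", "pubspec.yaml", ".github/workflows/ci.yml"];

// Returns true when a watched file changed but CONTEXT.md did not,
// i.e. the context doc is likely stale for this PR.
function contextIsStale(changedFiles: string[]): boolean {
  const touchedWatched = changedFiles.some((f) => WATCHED_FILES.includes(f));
  const touchedContext = changedFiles.includes("CONTEXT.md");
  return touchedWatched && !touchedContext;
}
```

Wire it into CI so a stale result fails the build with a one-line message; the friction is tiny, and it turns "update the doc in the same PR" from a habit into a guarantee.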
I also do a quarterly review: at the start of each quarter I read the file end-to-end and ask myself whether every statement is still accurate. It rarely takes more than 15 minutes, and it consistently surfaces small inconsistencies that have crept in.
Benefits I’ve Noticed
Since I started using this practice, a few things have changed noticeably:
More Relevant AI Suggestions
When the AI knows I’m using the repository pattern and feature folders, it structures its code suggestions accordingly. I spend far less time mentally translating a “generic” answer into something that fits my codebase.
Faster Onboarding
When a collaborator joins a project, the first thing I send them is the CONTEXT.md. It answers most of the “how do you do X here?” questions before they’re asked, which frees our first call for actual problems.
A Forcing Function for Clarity
Writing the file forces me to articulate decisions I’ve been making unconsciously. “Why are we using the repository pattern?” is a question I have to answer in writing. That process surfaces assumptions worth examining.
Consistent Style Across Sessions
AI assistants don’t retain memory between sessions by default. The CONTEXT.md restores that memory instantly. The result is that code generated in session 47 looks like it belongs next to code generated in session 3.
The Structure I Recommend
Here is a template you can copy and adapt:
```markdown
# <Project Name> — AI Context

## Project Overview
<One paragraph: purpose, audience, current scope.>

## Tech Stack
<Bullet list of languages, frameworks, services, and tools.>

## Architecture Decisions
<Key patterns and the reasons behind them.>

## Code Conventions
<Naming rules, file structure, linting config, commit format.>

## Workflow
<Branching, PR process, CI/CD pipeline, deployment targets.>

## Out of Scope
<Things the AI should not suggest or change.>
```
The “Out of Scope” section is often overlooked but extremely useful. It is the place where you write things like “do not migrate away from Supabase” or “do not add class-based components—this project uses functional components only.” It prevents the AI from confidently suggesting a refactor you have no intention of doing.
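To make that concrete, here is what a hypothetical "Out of Scope" section could look like for the Keppli example earlier in this post. The exact bullets are mine, written to match the constraints already described above, not copied from the real file:

```markdown
## Out of Scope
- Do not migrate away from Supabase (no Prisma, no Firebase).
- Do not bypass the repository pattern with direct Supabase calls in the UI layer.
- Do not replace the offline-first write path with direct network writes.
```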
Final Thoughts
The CONTEXT.md is one of those practices that feels almost too simple to work—but it does. It makes AI tools genuinely more useful because they operate with the same shared understanding you carry in your head. It makes onboarding smoother because new teammates have a written reference rather than having to decode the codebase by reading it. And it makes your own thinking clearer because you have to write down what you actually believe about the project.
If you take one thing from this post: pick a project you work on regularly, create a CONTEXT.md today, and use it in your next AI session. The difference in response quality will be immediate.
Thank you for reading. I hope this helps you get more out of the AI tools you already use every day.