Code AI with codebase context
Cody is an AI coding assistant that uses the latest LLMs and codebase context to help you understand, write, and fix code faster.
⭐ Install Cody from the VS Code Marketplace or the JetBrains Marketplace, then check out the demos to see what you can do.
— or —
- Build and run the VS Code extension locally: `pnpm install && cd vscode && pnpm run dev`
- See all supported editors
Cody is an open-source AI coding assistant that helps you understand, write, and fix code faster. It uses advanced search to pull context from both local and remote codebases so that you can use context about APIs, symbols, and usage patterns from across your codebase at any scale, all from within your IDE. Cody works with the newest and best large language models, including Claude 3.5 Sonnet and GPT-4o.
Cody is available for VS Code, JetBrains, and on the web.
See cody.dev for more info.
- Chat: Ask Cody questions about your codebase. Cody will use semantic search to retrieve files from your codebase and use context from those files to answer your questions. You can @-mention files to target specific context, and you can also add remote repositories as context on Cody Enterprise.
- Autocomplete: Cody makes single-line and multi-line suggestions as you type, speeding up your coding and saving you from hunting down function and variable names.
- Inline Edit: Ask Cody to fix or refactor code from anywhere in a file.
- Prompts: Cody has quick, customizable prompts for common actions. Simply highlight a code snippet and run a prompt, like “Document code,” “Explain code,” or “Generate Unit Tests.”
- Swappable LLMs: Support for Anthropic Claude 3.5 Sonnet, OpenAI GPT-4o, Mixtral, Gemini 1.5, and more.
- Free LLM usage included: Cody Free gives you access to Anthropic Claude 3.5 Sonnet and other models. It's available for individual devs on both personal and work code, subject to reasonable per-user rate limits (more info).
Cody comes with a variety of AI-for-coding features, such as autocomplete, chat, Smart Apply, generating unit tests, and more.
Here's an example of how you can combine some of these features to use Cody to work on a large codebase.
https://www.loom.com/share/ae710891c9044069a9017ee98ce657c5
All code in this repository is open source (Apache 2).
Quickstart: `pnpm install && pnpm build && cd vscode && pnpm run dev` to run a local build of the Cody VS Code extension.
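For convenience, here is the same quickstart expanded into individual steps (a sketch assuming git and pnpm are already installed; the clone URL is the public Cody repository):

```shell
# Clone the repository and install workspace dependencies
git clone https://github.com/sourcegraph/cody
cd cody
pnpm install

# Build the shared packages, then launch the VS Code extension in development mode
pnpm build
cd vscode
pnpm run dev
```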
See development docs for more.
Cody is often magical, but it's sometimes frustratingly wrong. Our goal is for Cody to be powerful and accurate. You can help:
- Use the 👍/👎 buttons in the chat sidebar to give feedback.
- File an issue (or submit a PR!) when you see problems.
- Community forum
- Discord
Individual usage of Cody currently requires a (free) Sourcegraph.com account because we need to prevent abuse of the free Anthropic/OpenAI LLM usage. We're working on supporting more swappable LLM options (including using your own Anthropic/OpenAI account or a self-hosted LLM) to make it possible to use Cody without any required third-party dependencies.
You can use Cody Free or Cody Pro when Codying on your work code. If that doesn't meet your needs (because you need a dedicated/single-tenant instance, audit logs, bring-your-own-model, etc.), upgrade to Cody Enterprise.
The Cody editor extensions work with:
- Sourcegraph Cloud
- Sourcegraph Enterprise Server (self-hosted) instances on version 5.1 or later