If you’ve ever felt daunted by the idea of writing Vim plugins, you’re not alone. Recently, I decided to create a Vim plugin that integrates with Ollama, using the Llama 3.1 model. It lets me send text and/or prompts to Llama without ever leaving my Vim window. It’s been years since I’ve done anything remotely complex in VimScript, so my skills are a bit rusty. No matter: with help from ChatGPT and Llama 3.1, I was able to create the plugin from start to finish in about an hour!
In this post, I’ll explain how to use the plugin, how it works, how to install it, and provide a couple of examples to showcase its capabilities.
What Does the Plugin Do?
This plugin allows you to send selected text (or the entire buffer) to Ollama’s Llama 3.1 model. The results are displayed in a separate “Command-Result” (CR) window within Vim. It’s like having Llama right inside your editor, ready to process your text or answer questions.
How to Use the Plugin
Once installed, the plugin provides two main functionalities:
- Send selected text or the whole buffer to Llama
– Mapping: <Leader>l
– Action: Sends the text to Llama and displays the output in the CR window.
- Send text along with a question
– Mapping: <Leader>L (note the capital ‘L’)
– Action: Prompts you to input a question, sends the text and the question to Llama, and displays the output.
Key Features:
- Works in visual mode (highlight specific text) or normal mode (process the entire buffer).
- Handles escape sequences so text passes safely through shell commands (see the example just below).
- Formats and displays results in a Markdown-friendly CR window.
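That escape handling is the fiddly part of shelling out from Vim. Vim’s built-in shellescape() function, for example, single-quotes a string and escapes any embedded quotes; whether or not the gist uses it directly, this is the transformation the plugin needs:

:echo shellescape("don't fail")
'don'\''t fail'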
Examples
Example 1: Summarize a Block of Text
- Highlight a block of text in visual mode.
- Press <Leader>l.
- The plugin sends the text to Llama for processing, and the summary appears in the CR window.
Example 2: Ask a Question About the Code
- Press <Leader>L while in normal or visual mode.
- Input your question, like: “Why does this function always return null?”
- Llama analyzes the code and your question and returns an answer.
How to Install It
Step 1: Install Ollama and Llama 3.1
Visit the Ollama website and download and install Ollama. Once it’s installed, open a terminal window and pull down a copy of Llama 3.1:
$ ollama pull llama3.1
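Pulling the model can take a few minutes. Once it finishes, you can sanity-check it with a one-off prompt straight from the terminal (any short prompt works; this one is just an example):

$ ollama run llama3.1 "Say hello in five words"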
Step 2: Copy the Plugin Code
Grab the code at [this gist] and save it into the file ~/.vim/plugin/ollama.vim.
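Note that if the ~/.vim/plugin directory doesn’t exist yet, you’ll need to create it first:

$ mkdir -p ~/.vim/plugin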
Step 3: Restart Vim
Restart Vim, and the plugin will be ready to use.
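Alternatively, you can load it into an already-running session without restarting:

:source ~/.vim/plugin/ollama.vim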
How It Works
Here’s a breakdown of how the plugin operates (a condensed sketch of the whole flow follows the list):
- Text Capture:
– Captures selected text in visual mode or the entire buffer in normal mode.
– Escapes single quotes so the text passes safely through shell commands.
- Ollama Command:
– Constructs a shell command from the captured text and the optional user-provided question.
– Sends the text to Ollama using the ollama run llama3.1 command.
- Output Processing:
– Cleans and formats the output using tr and sed.
– Displays the result in a Markdown-friendly CR window in Vim.
- Keybindings:
– <Leader>l: Sends the text without a question.
– <Leader>L: Prompts the user for a question before sending the text.
- Status Messages:
– Displays a temporary “Working…” message while processing.
– Restores the default status line once the output is ready.
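Putting those pieces together, here’s a condensed sketch of that flow. The function names and window-handling details are my illustration, not the gist’s actual code, but the moving parts match the list above:

" A condensed sketch of the plugin's flow. Function names are illustrative.

function! s:CaptureText(use_selection) abort
  if a:use_selection
    " Re-select and yank the last visual selection, preserving the
    " unnamed register.
    let l:saved = @"
    normal! gvy
    let l:text = @"
    let @" = l:saved
  else
    " Normal mode: grab the entire buffer.
    let l:text = join(getline(1, '$'), "\n")
  endif
  return l:text
endfunction

function! s:RunOllama(text, question) abort
  " Temporary status message while the model runs.
  echo 'Working...'

  " Prepend the user's question to the text when one was given.
  let l:prompt = empty(a:question) ? a:text : a:question . "\n\n" . a:text

  " shellescape() single-quotes the prompt and escapes embedded quotes;
  " tr/sed then squeeze blank runs and strip trailing whitespace.
  let l:cmd = 'ollama run llama3.1 ' . shellescape(l:prompt)
        \ . " | tr -s '\\n' | sed -e 's/[[:space:]]*$//'"
  let l:output = system(l:cmd)

  " Show the result in a Markdown-friendly scratch window (the CR window).
  botright new Command-Result
  setlocal buftype=nofile bufhidden=wipe noswapfile filetype=markdown
  call setline(1, split(l:output, "\n"))

  " Restore the default status area.
  redraw | echo ''
endfunction

function! s:OllamaSend(use_selection, ask) abort
  let l:question = a:ask ? input('Question: ') : ''
  call s:RunOllama(s:CaptureText(a:use_selection), l:question)
endfunction

" <Leader>l sends text as-is; <Leader>L asks for a question first.
nnoremap <silent> <Leader>l :call <SID>OllamaSend(0, 0)<CR>
vnoremap <silent> <Leader>l :<C-u>call <SID>OllamaSend(1, 0)<CR>
nnoremap <silent> <Leader>L :call <SID>OllamaSend(0, 1)<CR>
vnoremap <silent> <Leader>L :<C-u>call <SID>OllamaSend(1, 1)<CR>

The scratch-buffer settings (buftype=nofile, bufhidden=wipe, noswapfile) are what make the CR window disposable: close it and the result is gone, with no stray file or swap file left behind.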
Why This Plugin is Useful
This plugin is a quick and effective way to integrate Llama 3.1 into your Vim workflow. Whether you’re summarizing text, analyzing code, or generating insights, having Llama at your fingertips enhances productivity and creativity.
Conclusion
Writing a Vim plugin might seem daunting, but tools like Llama 3.1 and ChatGPT make it remarkably accessible—even for beginners. With just an hour of effort, this plugin is now a functional and valuable addition to my Vim toolkit.
If you’ve been thinking about automating workflows in Vim, why not give this plugin a try? Let me know how it works for you or if you make any interesting modifications!