I built ClipAI CLI, a local AI text assistant that lives entirely in your terminal and runs on Ollama. No API key, no subscription, no data leaving your machine.
The core idea: you should be able to grab text from anywhere (clipboard, file, or stdin), pass it through an AI operation, and get the result back without ever leaving the terminal.
Three modes
1. Interactive TUI (default)
Just run `clipai` with no arguments. You get a menu to pick your input source, choose an operation, see the result, and optionally copy it back to the clipboard.
2. Clipboard mode: the fastest daily workflow
Copy text in any app (Cmd+C), then:
```shell
clipai clip -o validate    # fix grammar; result back to clipboard
clipai clip -o summarize   # summarize; result back to clipboard
clipai clip -o formal      # formal rewrite; result back to clipboard
```
Then just Cmd+V wherever you want the result. This is the mode I use most: it slots into any app without context switching.
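Under the hood, clipboard mode is a simple round-trip: read the clipboard, run the operation, write the result back. Here is a minimal sketch of that flow on macOS using `pbpaste`/`pbcopy`; the function names and structure are hypothetical, not ClipAI's actual code:

```python
# Hypothetical sketch of the clipboard round-trip (macOS pbpaste/pbcopy).
import subprocess

def read_clipboard() -> str:
    # pbpaste prints the current clipboard contents to stdout
    return subprocess.run(["pbpaste"], capture_output=True, text=True).stdout

def write_clipboard(text: str) -> None:
    # pbcopy replaces the clipboard with whatever it reads on stdin
    subprocess.run(["pbcopy"], input=text, text=True)

def clip_mode(operation, transform, read=read_clipboard, write=write_clipboard):
    # transform is the AI call (operation, text) -> result; injected
    # here so the flow itself is testable without a model running
    result = transform(operation, read())
    write(result)
    return result
```

The injected `transform` is just a stand-in for the model call; the point is that the mode is nothing more than clipboard-in, clipboard-out around one operation.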
3. Pipe mode: composable shell pipelines
Reads from stdin, writes to stdout. Fully pipeable:
```shell
# Fix grammar in a file
cat draft.txt | clipai pipe -o validate > fixed.txt

# Chain operations
cat email.txt | clipai pipe -o shorten | clipai pipe -o formal

# Also copy to clipboard while printing to stdout
cat notes.txt | clipai pipe -o summarize --clipboard
```
Operations
| Name | What it does |
|---|---|
| `validate` | Fix grammar, spelling, clarity |
| `summarize` | 2-3 sentence summary |
| `formal` | Rewrite in a professional tone |
| `casual` | Rewrite in a friendly tone |
| `shorten` | Shorter, same message |
| `translate` | Translate to English |
Add your own by editing `clipai/ai.py`:

```python
OPERATIONS: dict[str, str] = {
    ...
    "mytask": "Your prompt here. Return only the result.",
}
```
It automatically appears in all three modes; no other changes needed.
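That works because every mode resolves operations through the same mapping. A sketch of that dispatch, with an assumed `prompt_for` helper and example entries (not the actual contents of `clipai/ai.py`):

```python
# One dict drives every mode: each entry maps an operation name to its prompt.
OPERATIONS: dict[str, str] = {
    "validate": "Fix grammar, spelling, and clarity. Return only the result.",
    "summarize": "Summarize in 2-3 sentences. Return only the result.",
}

def prompt_for(operation: str, text: str) -> str:
    # Unknown names fail loudly, listing the valid operations
    if operation not in OPERATIONS:
        raise ValueError(
            f"unknown operation: {operation}; choose from {sorted(OPERATIONS)}"
        )
    return f"{OPERATIONS[operation]}\n\n{text}"
```

Since the TUI menu, `clip`, and `pipe` all look names up in the same dict, a new key is instantly available everywhere.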
macOS right-click Services
There's also an installer for macOS Services, so you can run any operation directly from the right-click context menu in any app:
```shell
python install_services.py
```
Enable the "AI: *" entries under System Settings → Keyboard → Keyboard Shortcuts → Services → Text.
Install
Requires Ollama running locally with a pulled model:
```shell
brew install ollama
ollama pull llama3.2
ollama serve
```
Then:
```shell
pip install clipai-cli
```
Or clone and run with Poetry:
```shell
git clone https://github.com/chanukyapekala/clipai
cd clipai
poetry install
clipai
```
Source: github.com/chanukyapekala/clipai