Get the latest version of Wurdump for macOS. Choose your preferred installation method.
Initial release with core AI clipboard functionality
Wurdump requires gpt-oss:20b running locally via Ollama at localhost:11434. Your Mac needs at least 16 GB of RAM.
curl -fsSL https://ollama.ai/install.sh | sh
ollama pull gpt-oss:20b
ollama serve
Wurdump will automatically connect to localhost:11434 when launched.
Make sure Ollama is running with gpt-oss:20b before starting Wurdump.
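A quick way to confirm this precondition is to query Ollama's local API before launching Wurdump. This is a minimal sketch: the port (11434) and the /api/tags endpoint are Ollama defaults, and the check function name is our own, not part of Wurdump.

```shell
# Preflight check: is Ollama serving on its default port, and has
# gpt-oss:20b been pulled? (/api/tags lists locally available models.)
check_ollama() {
  if curl -s --max-time 2 http://localhost:11434/api/tags | grep -q "gpt-oss:20b"; then
    echo "ok: Ollama is serving gpt-oss:20b"
  else
    echo "missing: run 'ollama serve' and 'ollama pull gpt-oss:20b' first"
  fi
}
check_ollama
```

If the check prints the "missing" line, start Ollama and pull the model before opening Wurdump.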