Making AI More Private

Local-first AI that respects your privacy.

Your personal data is never sent to external servers.

Try the prototype!

docker run -p 8080:8080 ghcr.io/svenplb/liebotsch
go install github.com/svenplb/liebotsch/cmd/liebotsch@latest
I want to use Liebotsch, a local privacy proxy for LLM calls that strips personal data before sending it to the AI, then restores it in responses. Set it up: run the Docker image ghcr.io/svenplb/liebotsch on port 8080, then point my OpenAI API base URL to http://localhost:8080/v1.
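Once the proxy is running, routing calls through it is only a base-URL change. A minimal sketch of what that looks like for an OpenAI-style chat request (the model name is a hypothetical placeholder, and no network call is made here):

```python
# Illustrative sketch: the same OpenAI-style chat request, with only the
# base URL swapped so calls go through the local Liebotsch proxy.
# No network call is made; this just builds the request.

OPENAI_URL = "https://api.openai.com/v1"
LIEBOTSCH_URL = "http://localhost:8080/v1"  # local proxy from the docker command above

def chat_request(base_url: str, prompt: str) -> dict:
    """Build the request an OpenAI-compatible client would send."""
    return {
        "url": f"{base_url}/chat/completions",
        "json": {
            "model": "gpt-4o-mini",  # hypothetical model name, for illustration only
            "messages": [{"role": "user", "content": prompt}],
        },
    }

# Swapping the base URL is the only change needed.
req = chat_request(LIEBOTSCH_URL, "Draft a refund response ...")
print(req["url"])  # http://localhost:8080/v1/chat/completions
```

Everything else about the request stays identical, which is why existing apps need only that one configuration change.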

Draft a refund response for this complaint from Emma Rodriguez (customer ID CUST-88231). She ordered product #P-4421 on 02.01.2024 for $329.99 and never received it. Ship to: 14 Oak Lane, Berlin 10115.


How it works
  1. Sits between your app and the LLM. One command, runs entirely on your machine.
  2. Point your app at it. Change one line - your API base URL - and all LLM calls go through Liebotsch.
  3. Send prompts as normal. Personal data is stripped before the AI sees it, restored before you see the response.
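The strip/restore round trip in step 3 can be sketched as a reversible placeholder substitution. This is a toy illustration, not Liebotsch's actual detection logic; the regex patterns below are hard-coded stand-ins for real PII detection:

```python
import re

# Toy sketch of strip/restore: replace personal data with placeholders
# before the prompt leaves the machine, then map placeholders in the
# model's reply back to the originals. Patterns are illustrative only.
PATTERNS = {
    "NAME": re.compile(r"Emma Rodriguez"),           # stand-in for real name detection
    "CUSTOMER_ID": re.compile(r"CUST-\d+"),
    "ADDRESS": re.compile(r"14 Oak Lane, Berlin \d{5}"),
}

def strip_pii(text: str) -> tuple[str, dict]:
    """Replace each detected value with a placeholder token; return the mapping."""
    mapping = {}
    for label, pattern in PATTERNS.items():
        for i, match in enumerate(pattern.findall(text)):
            token = f"<{label}_{i}>"
            mapping[token] = match
            text = text.replace(match, token)
    return text, mapping

def restore_pii(text: str, mapping: dict) -> str:
    """Map placeholder tokens back to the original values."""
    for token, original in mapping.items():
        text = text.replace(token, original)
    return text

prompt = "Refund for Emma Rodriguez (customer ID CUST-88231), 14 Oak Lane, Berlin 10115."
stripped, mapping = strip_pii(prompt)
print(stripped)   # the LLM only ever sees the placeholder version
reply = "Dear <NAME_0>, your refund is on its way."
print(restore_pii(reply, mapping))  # Dear Emma Rodriguez, your refund is on its way.
```

The key property is that the mapping never leaves your machine, so the upstream model only ever sees placeholders.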

Help shape Liebotsch.

Browse open problems and ongoing work. Contributions welcome.

View roadmap