Local AI knowledge assistant

Query your own documents — fully offline.

LokLM keeps your documents encrypted on-device and answers questions through a chat interface — with clickable citations. No cloud, no external AI APIs.

100% offline · Local encryption · Open source (MIT)
Features

What LokLM does

A desktop app for anyone who needs answers with verifiable sources — without handing their data over to a third party.

  • Fully offline

    Models run locally. No account, no telemetry, no calls to external providers.

  • Clickable citations

    Every answer cites your own documents — click straight through to the original passage.

  • PDF, Markdown, Text, Code

    Import documents in the common formats and organise them into workspaces.

  • Encrypted vault

    Argon2id password hashing, AES-GCM per-file encryption, recovery via an 18-word passphrase.

  • Your data stays with you

    Everything lives in a single vault file — easy to back up, easy to migrate.

  • Open source

    MIT licence. Audit, fork, contribute: the full source is on GitHub.

Download

Latest release. Verify the SHA-256 checksum before installing.

Version: v0.1.2
Released: 2026-05-17
Size: 103 MB

  • Windows (103 MB): Download for Windows

  • macOS: Coming soon

  • Linux: Coming soon

The first install downloads ~20 GB of model files, so a stable connection is recommended. After that, LokLM runs fully offline.

SHA-256 — LokLM-Setup-0.1.2.exe

f18dd8da2dafe129fd7ae715f6bfc77f19d57cc36554bd441a21733d846e85dd
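On Windows you can compute the installer's hash with `certutil -hashfile LokLM-Setup-0.1.2.exe SHA256` or PowerShell's `Get-FileHash`. As a sketch, the same check in a POSIX shell (the installer is assumed to sit in the current directory):

```shell
# Compare a file's SHA-256 against a published checksum.
# Prints OK on a match, MISMATCH otherwise (and returns non-zero).
verify_sha256() {
  actual=$(sha256sum "$1" | awk '{print $1}')
  if [ "$actual" = "$2" ]; then
    echo "OK: $1"
  else
    echo "MISMATCH: $1"
    return 1
  fi
}

# For the installer above:
# verify_sha256 LokLM-Setup-0.1.2.exe f18dd8da2dafe129fd7ae715f6bfc77f19d57cc36554bd441a21733d846e85dd
```

Only run the installer if the hashes match; a mismatch usually means a corrupted or tampered download.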

System requirements

  • OS: Windows 10/11 (x64)

  • RAM: 16 GB recommended

  • Disk: 25 GB free space