**Source:** GitHub (localgpt-app/localgpt)

A new open source project called **LocalGPT** positions itself as a *local-first, on-device* AI assistant built in Rust. The project emphasizes privacy (memory data stays on the device) and a systems-style design that combines persistent memory with scheduled background work.

## Highlights

- **Single-binary install** via `cargo install localgpt` (no Node/Docker/Python required)

- **Persistent memory** stored as markdown files and indexed with SQLite FTS5; semantic search via `sqlite-vec` (see the storage sketch after this list)

- **Autonomous heartbeat** for background tasks in daemon mode (see the loop sketch after this list)

- Interfaces: **CLI, web UI, desktop GUI**

- LLM providers: Anthropic, OpenAI, Ollama

- Workspace layout compatible with OpenClaw-style files: `SOUL.md`, `MEMORY.md`, `HEARTBEAT.md`
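
The memory design pairs plain markdown files with a SQLite full-text index. As a minimal sketch of that pattern only, not LocalGPT's actual schema or storage code, the following Rust snippet assumes the `rusqlite` crate and a SQLite build with FTS5 enabled (e.g. rusqlite's `bundled` feature); semantic search would additionally go through `sqlite-vec`, which is omitted here.

```rust
use rusqlite::{params, Connection, Result};

// Hypothetical sketch: index a markdown memory file in an FTS5 table
// and run a keyword search over it. Table and file names are illustrative.
fn main() -> Result<()> {
    let conn = Connection::open("memory.db")?;

    // Full-text index over memory entries (requires FTS5 support,
    // e.g. rusqlite's `bundled` feature).
    conn.execute_batch(
        "CREATE VIRTUAL TABLE IF NOT EXISTS memory_fts USING fts5(path, body);",
    )?;

    // Index the markdown memory file as one document.
    let body = std::fs::read_to_string("MEMORY.md").unwrap_or_default();
    conn.execute(
        "INSERT INTO memory_fts (path, body) VALUES (?1, ?2)",
        params!["MEMORY.md", body],
    )?;

    // Keyword search ranked by FTS5's built-in relevance.
    let mut stmt = conn.prepare(
        "SELECT path FROM memory_fts WHERE memory_fts MATCH ?1 ORDER BY rank",
    )?;
    let hits: Vec<String> = stmt
        .query_map(params!["heartbeat"], |row| row.get(0))?
        .collect::<Result<_>>()?;
    println!("matches: {hits:?}");
    Ok(())
}
```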

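The heartbeat is, in effect, a scheduler that wakes the agent on an interval while the daemon is running. The loop below is a hedged illustration of that idea; the interval, the use of `HEARTBEAT.md` as the task source, and the synchronous design are assumptions, not the project's actual implementation.

```rust
use std::{fs, thread, time::Duration};

// Hypothetical heartbeat loop: on each tick, re-read HEARTBEAT.md and
// hand any instructions to the agent. A real daemon would likely use an
// async runtime and make LLM calls instead of printing.
fn run_heartbeat(interval: Duration) {
    loop {
        match fs::read_to_string("HEARTBEAT.md") {
            Ok(tasks) if !tasks.trim().is_empty() => {
                // Placeholder for "wake the agent with these tasks".
                println!("heartbeat tick: {} bytes of tasks", tasks.len());
            }
            _ => println!("heartbeat tick: nothing to do"),
        }
        thread::sleep(interval);
    }
}

fn main() {
    run_heartbeat(Duration::from_secs(30 * 60)); // e.g. every 30 minutes
}
```
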
## Why it matters (AI + Developer Tools)

The “local-first agent” pattern is gaining momentum as teams look for assistants that are easier to deploy, easier to audit, and less dependent on always-on cloud backends.

**Link:** https://github.com/localgpt-app/localgpt