OpenClaw v2026.3.2: The Features That Actually Matter
Release notes are usually a wall of commit hashes nobody reads. So I’m going to pull out the changes that actually affect how you run OpenClaw day to day, explain what they do in plain English, and tell you where I think they’ll matter most.
Full release: v2026.3.2 on GitHub
The Headliners
PDF Analysis Tool
OpenClaw now has a first-class pdf tool with native Anthropic and Google provider support. Before this, handling PDFs meant extracting text externally and pasting it in. Now you just hand it a PDF and the model reads it directly.
You can configure defaults for which model handles PDFs (agents.defaults.pdfModel), max file size (pdfMaxBytesMb), and page limits (pdfMaxPages). There’s also an extraction fallback for models that don’t support native PDF input.
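A minimal sketch of what those defaults might look like, assuming a JSON5-style config file (the key names come from the release notes; the surrounding structure and the model id are my guesses):

```json5
{
  agents: {
    defaults: {
      pdfModel: "claude-sonnet", // hypothetical model id; pick one with native PDF input
      pdfMaxBytesMb: 25,         // reject PDFs larger than 25 MB
      pdfMaxPages: 200,          // stop reading after 200 pages
    },
  },
}
```

Models without native PDF support fall back to the extraction path automatically.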
Why it matters: If you’re processing invoices, research papers, reports, or any document-heavy workflow, this removes a manual step. Point your agent at a PDF directory and let it work. I’m eyeing this for auto-ingesting class materials into my Obsidian vault.
Ollama Memory Embeddings
This one is huge for the local-first crowd. You can now set memorySearch.provider = "ollama" or use Ollama as a fallback for memory embeddings. It honors your existing models.providers.ollama config.
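Sketched as config, in the same JSON5 style (`memorySearch.provider` and `models.providers.ollama` are from the notes; the host and the embedding-model key are my placeholders):

```json5
{
  memorySearch: {
    provider: "ollama", // route memory embeddings through local Ollama
  },
  models: {
    providers: {
      ollama: {
        baseUrl: "http://localhost:11434",    // Ollama's default local endpoint
        embeddingModel: "qwen3-embedding:8b", // hypothetical key name
      },
    },
  },
}
```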
Why it matters: Memory search embeddings were one of the last things that required an external API call. If you’re running a local embedding model (which you should be; it’s free and fast), you can now route memory embeddings through it too. Zero tokens burned on embeddings. Zero data leaving your machine. I’ve been running this pattern manually for weeks; glad to see it’s now a first-class config option.
In my setup I’m using qwen3-embedding:8b for all local embeddings. It handles semantic memory search, code search indexing, and now OpenClaw’s memory system. Ollama has a growing library of embedding models you can browse. If you’re on a GPU with 8GB+ VRAM, qwen3-embedding:8b is the sweet spot between quality and speed. For smaller hardware, nomic-embed-text still works great at a fraction of the size.
MiniMax-M2.5-Highspeed
First-class MiniMax support across provider catalogs, onboarding flows, and OAuth defaults. Legacy MiniMax-M2.5-Lightning configs still work.
Why it matters: MiniMax is a solid budget option when you need a capable model but don’t want to burn Opus tokens. Having it as a proper first-class provider means cleaner config and better onboarding instead of hacking it in through OpenAI-compatible endpoints. Good fallback model for batch work.
Telegram Streaming by Default
New Telegram setups now default to partial streaming instead of off. You get live preview streaming out of the box. DM streaming uses sendMessageDraft for private preview with separated reasoning/answer lanes.
Why it matters: This was always the first thing I changed after a fresh install. Seeing your agent “think” in real-time instead of waiting for the full response makes the experience dramatically better. Smart default change.
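For existing installs that predate the new default, the setting presumably lives under your Telegram channel config. Every key name below is a guess on my part, so check the actual schema before copying:

```json5
{
  channels: {
    telegram: {
      streaming: "partial", // hypothetical key; something like "off" | "partial"
    },
  },
}
```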
The Sleepers (Easy to Miss, Hard to Live Without)
SecretRef Expansion
SecretRef support now covers 64 credential surfaces across the entire config. Unresolved refs fail fast on active surfaces and report non-blocking diagnostics on inactive ones.
Why it matters: If you’re running OpenClaw on shared infrastructure or managing multiple instances, hardcoded API keys in your config file are a liability. SecretRef lets you pull credentials from environment variables, secret managers, or vaults. The fail-fast behavior means you’ll know immediately if a secret didn’t resolve instead of getting mysterious auth failures at 3 AM.
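A sketch of the pattern with a hypothetical ref syntax (the actual SecretRef format may differ; the point is that the config holds a pointer to a credential, never the credential itself):

```json5
{
  models: {
    providers: {
      anthropic: {
        // Resolved at startup; fails fast if ANTHROPIC_API_KEY is unset
        apiKey: { secretRef: "env:ANTHROPIC_API_KEY" }, // hypothetical syntax
      },
    },
  },
}
```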
Subagent File Attachments
You can now pass inline file attachments when spawning subagents via sessions_spawn. Supports base64/utf8 encoding with automatic cleanup and configurable limits.
Why it matters: Before this, if you wanted a subagent to work on a file, you had to write it to disk first and tell the agent where to find it. Now you can pass the content directly. Cleaner for short-lived tasks where you don’t want temp files hanging around.
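A sketch of what such a spawn payload might look like (the `sessions_spawn` name is from the notes; the field names are my assumptions):

```json5
{
  // Hypothetical sessions_spawn arguments
  task: "Summarize the attached report",
  attachments: [
    {
      name: "report.txt",
      encoding: "utf8", // or "base64" for binary content
      content: "Q3 revenue was flat...",
    },
  ],
}
```

The attachment is cleaned up automatically when the subagent session ends, so nothing lingers on disk.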
Config Validation CLI
New openclaw config validate command (with --json output) checks your config before gateway startup. Invalid key paths now show up in startup errors with specific details.
Why it matters: Ever spent 20 minutes debugging why your agent wasn’t behaving right, only to discover a typo in your config? This catches that before the gateway even starts. Should have existed from day one. Run it in CI if you’re managing configs across multiple machines.
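As a CI step, that could look something like this GitHub Actions sketch (assumes `openclaw` installs via npm, as the upgrade command at the end of this post suggests):

```yaml
# Hypothetical CI job: fail the build on an invalid config
validate-config:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4
    - run: npm install -g openclaw
    - run: openclaw config validate --json
```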
Audio Echo Transcript
Optional tools.media.audio.echoTranscript sends a transcript confirmation back to the chat before the agent processes a voice message. Disabled by default.
Why it matters: If you use voice notes with OpenClaw (I do, through Telegram), there’s always a moment of “did it hear me correctly?” before the response arrives. Enabling echo gives you immediate feedback on what it transcribed. Useful for catching misheard commands before the agent acts on them.
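Enabling it, using the key path from the notes (surrounding structure is my assumption):

```json5
{
  tools: {
    media: {
      audio: {
        echoTranscript: true, // echo the transcription back before the agent acts on it
      },
    },
  },
}
```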
Breaking Changes (Read These)
Three things that might bite you on upgrade:
- Default tool profile changed to messaging. New installs no longer start with broad coding/system tools. If you’re setting up a fresh instance and wondering why your agent can’t read files, check tools.profile. Existing configs are unaffected.
- ACP dispatch defaults to enabled. If you were relying on ACP being off by default, it’s now on. Set acp.dispatch.enabled=false explicitly if you need it off.
- Plugin SDK dropped registerHttpHandler. If you wrote custom plugins, you need to migrate to registerHttpRoute with explicit auth. This is a security improvement, but it will break existing plugin code.
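If either of the first two changes bites you, the overrides use the key paths named above. The profile value is hypothetical, since the notes only name the new messaging default:

```json5
{
  tools: {
    profile: "coding", // hypothetical profile name; fresh installs now default to "messaging"
  },
  acp: {
    dispatch: {
      enabled: false, // ACP dispatch is now on by default; set false to opt out
    },
  },
}
```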
Security Hardening
This release has a heavy security focus, which is great to see:
- Webhook auth-before-body parsing for multiple channel plugins. Previously some handlers would read the full request body before checking authentication, which opened the door to slow-body DoS attacks.
- Gateway WebSocket security now keeps plaintext ws:// loopback-only by default. Private network access requires an explicit opt-in flag.
- Plugin HTTP route hardening requires explicit auth for registration and adds ownership guards against duplicate path conflicts.
- Regex evaluation bounds for session filters and log redaction to prevent ReDoS (regular-expression denial-of-service) patterns.
Why it matters: If your OpenClaw instance is exposed to the internet (Telegram webhooks, Discord bots, etc.), these fixes close real attack surfaces. The auth-before-body change alone prevents a class of denial-of-service that could have hung your gateway with a single slow HTTP request. Update sooner rather than later.
Bottom Line
v2026.3.2 is a “mature the platform” release. PDF support, Ollama memory embeddings, and the security hardening are the standouts. No flashy new paradigms, just solid improvements to things you’re already using.
The Ollama memory embedding support in particular makes the fully-local story much more complete. If you’re privacy-conscious or just cheap (no shame, same), you can now run embeddings, memory search, and triage all on local hardware with zero external API calls.
Upgrade command:
openclaw update
# or
npm update -g openclaw
Then run openclaw config validate before restarting. Because now you can.