This changes everything. Introducing Maple Proxy – a powerful bridge that lets you call Maple’s end‑to‑end‑encrypted LLMs with any OpenAI‑compatible client, without changing a line of code. Read the full announcement or check out the thread

Replies (32)

Why it matters: Most AI‑powered apps (calorie trackers, CRMs, study companions, relationship apps 👀) send every prompt to OpenAI, exposing all user data. With Maple Proxy, that data never leaves a hardware‑isolated enclave.

How it works:
🔒 Inference runs inside a Trusted Execution Environment (TEE).
🔐 End‑to‑end encryption keeps prompts and responses private.
✅ Cryptographic attestation proves you're talking to genuine secure hardware.
🚫 Zero data retention – no logs, no training data.

Ready‑to‑use models (pay‑as‑you‑go, per million tokens):
- llama3‑3‑70b – general reasoning
- gpt‑oss‑120b – creative chat
- deepseek‑r1‑0528 – advanced math & coding
- mistral‑small‑3‑1‑24b – conversational agents
- qwen2‑5‑72b – multilingual work & coding
- qwen3‑coder‑480b – specialized coding assistant
- gemma‑3‑27b‑it‑fp8‑dynamic – fast image analysis

Real‑world use cases:
🗓️ A calorie‑counting app replaces public OpenAI calls with Maple Proxy, delivering personalized meal plans while keeping dietary data private.
📚 A startup's internal knowledge‑base search runs through the proxy, so confidential architecture details never leave the enclave.
👩‍💻 A coding‑assistant plug‑in for any IDE points to http://localhost:8080/v1 and suggests code, refactors, and explains errors without exposing proprietary code.

Getting started is simple:

Desktop app (fastest for local dev)
- Download from trymaple.ai/downloads
- Sign up for a Pro/Team/Max plan (starts at $20/mo)
- Purchase $10+ of credits
- Click "Start Proxy" → API key & localhost endpoint are ready.

Docker image (production‑ready)
- `docker pull ghcr.io/opensecretcloud/maple-proxy:latest`
- Run with your MAPLE_API_KEY and MAPLE_BACKEND_URL
- You now have a secure OpenAI‑compatible endpoint at http://localhost:8080/v1

Compatibility: Any library that lets you set a base URL works – LangChain, LlamaIndex, Amp, Open Interpreter, Goose, Jan, and virtually every OpenAI‑compatible SDK.

Need more detail?
Check the technical write‑up and full API reference on GitHub, or join the Discord community for real‑time help. Start building with private AI today: download the app or pull the Docker image, upgrade to a plan, add a few dollars of credits, point your client to http://localhost:8080/v1, and secure all your apps.
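Since the proxy exposes an OpenAI‑compatible endpoint at http://localhost:8080/v1, any HTTP client can talk to it with the standard chat‑completions request shape. Here's a minimal sketch using only the Python standard library – the endpoint URL and model name are from the announcement above, but the API key value is a placeholder (use whatever key the desktop app or your account issues you), and the exact response fields are assumed to follow the OpenAI format:

```python
import json
import urllib.request

# Local Maple Proxy endpoint from the announcement; the key is a placeholder.
MAPLE_PROXY_URL = "http://localhost:8080/v1"
MAPLE_API_KEY = "your-maple-api-key"


def build_chat_request(prompt: str, model: str = "llama3-3-70b") -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request aimed at the local proxy."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{MAPLE_PROXY_URL}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {MAPLE_API_KEY}",
        },
    )


def chat(prompt: str, model: str = "llama3-3-70b") -> str:
    """Send the request to the running proxy and return the reply text."""
    with urllib.request.urlopen(build_chat_request(prompt, model)) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```

The same base‑URL swap is all that LangChain, LlamaIndex, or the official OpenAI SDKs need – point them at http://localhost:8080/v1 instead of api.openai.com and the rest of your code stays unchanged.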
I'm insanely bullish on confidential compute, and @Maple AI / @OpenSecret AI will provide a tremendous amount of value for the world. But it will also allow for mind manipulation on steroids that makes what happened with Web 2.0 companies like Facebook look tame. There's now an alternative to the closed‑source mega models and hosting, though – OpenSecret / Maple AI effectively offer developers a publicly accessible (and open‑source) version of Apple's encrypted inference cloud. Thank god for cryptography, the work of people like Anthony / @Marks, and organizations like Apple.
. 5 months ago
It does, in that it can use OpenAI upstream and issue API keys from the routstr client to use in other OpenAI clients. I think Maple may sit more as an upstream provider from a routstr perspective. Something like: my routstr provider would use Maple as an upstream provider, users would query my provider, pay in Cashu, and receive the response from Maple via my proxy. Maple AI could run a routstr proxy and offer no‑account, KYC‑free access to Maple. For now, though, all I see is that Maple could add Nostr login and save encrypted chats to Nostr relays, like routstr is doing. Just spitballing what I think the configuration would be.
It does, but we need end‑to‑end encryption, so the routstr node should not touch the unencrypted part. And it would be sick if Maple hosted a routstr node to serve its own models.
We have the OpenAI open-source gpt-oss-120b model running in a secure GPU. That gives users end-to-end encryption. It is comparable to the GPT-4 family of models.
legendary! Having not read anything about this proxy yet – are there docs specifically for the Goose scenario?
Charlie 5 months ago
I need to try Maple soon 👀
Do I understand that correctly: I need to be a Pro subscriber and then purchase API credits on top?! I love the zero‑knowledge approach you guys have, but for the occasional document analysis I can't justify $20/month. I'd gladly pay twice what other models charge on a pay‑per‑token basis, like on ppq.ai.