I’m serving several local (LM Studio) and remote (@Maple AI) models to OpenWebUI, where MCP servers can be added easily and pipelines can be used to extend it further. Serve it to yourself locally, over ZeroTier/Tailscale/etc., or use sish/boringproxy/etc. to expose it to the web. There are also a few native apps that connect to OpenWebUI.
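If you want to sanity-check the LM Studio side before wiring it into OpenWebUI, something like this works. It's a rough sketch, assuming LM Studio's local server is running on its default port (1234) with the OpenAI-compatible API enabled and at least one model loaded; OpenWebUI can then be pointed at the same `/v1` base URL as an OpenAI-compatible connection:

```python
import requests

BASE_URL = "http://localhost:1234/v1"  # LM Studio's default local server; adjust if yours differs

# List the models LM Studio is serving -- the same endpoint OpenWebUI
# queries when you add it as an OpenAI-compatible connection.
models = requests.get(f"{BASE_URL}/models", timeout=5).json()["data"]
print("available:", [m["id"] for m in models])

# Fire a quick chat completion at the first model to confirm it responds.
reply = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": models[0]["id"],
        "messages": [{"role": "user", "content": "Say hi in five words."}],
    },
    timeout=60,
).json()
print(reply["choices"][0]["message"]["content"])
```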
Still working my way over to CrewAI, but it looks interesting.