⚡💻 TECH - Anthropic: We just released Claude Code channels, which allows you to control your Claude Code session through select MCPs, starting with Telegram and Discord.
Use this to message Claude Code directly from your phone.
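For anyone wondering what hooking an MCP server into Claude Code looks like in practice: this is a minimal sketch of a project-scoped `.mcp.json`, the config file Claude Code reads for MCP servers. The server name and the `example-telegram-mcp` package are hypothetical placeholders, not the actual channels integration Anthropic shipped.

```json
{
  "mcpServers": {
    "telegram": {
      "command": "npx",
      "args": ["-y", "example-telegram-mcp"],
      "env": { "TELEGRAM_BOT_TOKEN": "YOUR_TOKEN_HERE" }
    }
  }
}
```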
Replies (10)
Watch out, OpenClaw… Anthropic is coming for your free lunch.
OpenAI hires the guy who created OpenClaw, and then Anthropic integrates OpenClaw-style tools into Claude before OpenAI does! Crazy! 🤪
Oh wow, it's all happening too fast!
I was wondering whether I should use my Claw for coding at all, or whether I'd be better off interfacing directly with Claude Code or Codex. I haven't used either very much yet, but I suspect adding the Claw layer for any significant development would prove to be a mistake, and that I'd be better off working directly with their tools, only handing things over to my Claw once something is built (possibly for scheduling automated runs or the like). I should probably focus on nearer-term stuff for now, as I'm not there yet anyway.
Thanks for sharing!
I hit a wall because of Mac permissions when trying to code something with my Claw. I tried it with Goose and had some initial success, but haven't had time to take it further.
Claude Code still can't execute anything, right? It can only deliver code.
Am curious, what model or model provider (is that the right term, or does saying this reveal I'm a newb?) do you use? And do you use the API or OAuth?
I'm doing OpenAI, and have only tried a couple of models so far. I created an API key instead of doing an OAuth link to the monthly subscription, so it costs me money as I go, but I was under the impression that was the better route for Claws for some reason. I'm now looking into creating an OpenRouter account/key and maybe using that, as it would give me access to all the models... and even though I don't know why I'll need that, I figure more options is better. Plus it gives me something to do.
Forgive any newb-speak in the above, I'm really trying....
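For anyone else weighing API key vs. OAuth: the appeal of OpenRouter is that it exposes a single OpenAI-compatible endpoint (`https://openrouter.ai/api/v1/chat/completions`), so one key covers many providers' models. A minimal sketch of building a request body for it, assuming you just POST this JSON with your key in an `Authorization: Bearer` header (the model slug here is only an example):

```python
# Sketch: build the JSON body for an OpenRouter chat completion request.
# OpenRouter's API is OpenAI-compatible; the model slug below is just an
# illustrative example, not a recommendation.
import json


def build_openrouter_request(model: str, prompt: str) -> dict:
    """Return the request body for POST https://openrouter.ai/api/v1/chat/completions."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


body = build_openrouter_request("openai/gpt-4o-mini", "Say hello")
print(json.dumps(body))
```

Because the endpoint is OpenAI-compatible, the official `openai` Python client also works if you point its `base_url` at OpenRouter and pass your OpenRouter key as the API key.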
It’s all good. I’m no expert. I only got my Claw a couple of weeks ago and spent half the time trying to fix it! I’m using Ollama Pro and their open-weight cloud models. They’re the Chinese models, but I’m not too bothered by that. Some people are. Minimax 2.5, I think. For what I need, I don’t come anywhere near running out of tokens. They’ve announced some better integration at the point of config.