Dustin Dannenhauer
dustind@dtdannen.github.io
npub1mgvw...pdjc
DVM maximalist. Building DVMDash - a monitoring and debugging tool for DVMs: https://dvmdash.live
Live DVM stats here: https://stats.dvmdash.live
Hacking on ezdvm - a python library for making DVMs: https://github.com/dtdannen/ezdvm
Dustin 2 years ago
Are there any LLM projects for entire code bases? Feels like I should be able to train an LLM on Damus source code to get suggestions on how to add translations into Nostur…
Dustin 2 years ago
Could we make NIP-25 reactions include arbitrary strings like "this worked but took too long" or numbers like "96"? Terrible idea? Currently the spec says the content should be an emoji, '+', or '-'. I want to expand it so we can give DVMs the reactions they deserve (with flexibility on the amount of detail).
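To make the idea concrete, here is a minimal sketch (Python, since that's the language ezdvm uses) of what such an extended reaction event could look like. The arbitrary-string content is the proposed change, not what NIP-25 currently allows, and the placeholder ids are hypothetical.

```python
import json
import time

def build_extended_reaction(reacted_event_id: str, reacted_author_pubkey: str,
                            content: str) -> dict:
    """Sketch of a kind-7 reaction whose content is an arbitrary string
    (e.g. "this worked but took too long" or "96") rather than '+', '-',
    or an emoji; this is the proposed extension, not current NIP-25."""
    return {
        "kind": 7,                         # NIP-25 reaction kind
        "created_at": int(time.time()),
        "content": content,                # arbitrary feedback string (the proposal)
        "tags": [
            ["e", reacted_event_id],       # event being reacted to
            ["p", reacted_author_pubkey],  # author of that event
        ],
        # "id", "pubkey", and "sig" get filled in when the event is signed
    }

print(json.dumps(build_extended_reaction("<dvm-result-event-id>",
                                         "<dvm-pubkey>",
                                         "this worked but took too long"), indent=2))
```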
Dustin 2 years ago
Oh boy I could really use an LLM fine-tuned on the NIPs docs right now.
Dustin 2 years ago
DVMs will help clients and relays stay lighter by offloading computation. Algorithms for sorting your feed, for example, could run as DVMs.
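As a rough sketch of how that might look with NIP-90 job requests (assuming a content-discovery style job kind; the kind number, tag layout, bid, and relay URL below are illustrative, not a spec):

```python
import json
import time

def build_feed_sort_request(user_pubkey: str, relay_url: str) -> dict:
    """Hypothetical NIP-90 job request asking a DVM to curate/sort a user's feed."""
    return {
        "kind": 5300,                    # assumed content-discovery job kind (5000-5999 range)
        "created_at": int(time.time()),
        "content": "",
        "tags": [
            ["i", user_pubkey, "text"],  # input: whose feed to curate
            ["relays", relay_url],       # where the DVM should publish its result
            ["bid", "1000"],             # offered payment in millisats (optional)
        ],
    }

print(json.dumps(build_feed_sort_request("<user-pubkey>", "wss://relay.example.com"), indent=2))
# A DVM that accepts the job would reply with a result event of kind
# (request kind + 1000) containing the curated list of note ids.
```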
Dustin 2 years ago
If you're interested in AI, Rao offers one of the best perspectives to follow. He cuts directly through the hype, and his lab has plenty of papers to back up his claims. He's not on Nostr yet; for now Twitter and LinkedIn seem to be where he posts.
Dustin 2 years ago
If a client chooses to build a new feature as a DVM, then other clients could adopt the feature quickly. Consider recommending new people to follow on Nostr: each client could implement their own version, or instead one person could make a DVM that other clients hit to get the data. DVMs may cost money, but paying per request probably doesn't always make sense. "Salaried" DVMs might be a solution: client subscription fees could cover DVM fees, and free clients wouldn't have the paid DVM features. Network effects would be insane here!
Dustin 2 years ago
It's interesting how actions are chosen across different LLM agents. Judging from the description on OpenAI's website, which function (aka tool) to call is decided by the model (i.e. GPT-4) based on the user's incoming message and chat (thread) history:

"In this example, we define a single function get_current_weather. The model calls the function multiple times, and after sending the function response back to the model, we let it decide the next step. It responded with a user-facing message which was telling the user the temperature in San Francisco, Tokyo, and Paris. Depending on the query, it may choose to call a function again. If you want to force the model to call a specific function you can do so by setting tool_choice with a specific function name. You can also force the model to generate a user-facing message by setting tool_choice: "none". Note that the default behavior (tool_choice: "auto") is for the model to decide on its own whether to call a function and if so which function to call."

See https://platform.openai.com/docs/guides/function-calling
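For reference, a minimal sketch of that flow with the OpenAI Python SDK, loosely following the get_current_weather example from the quoted docs (the model name, tool schema, and prompt are just placeholders):

```python
import json
from openai import OpenAI  # assumes the openai>=1.x Python SDK

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# One tool, mirroring the get_current_weather example from the docs.
tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City name, e.g. San Francisco"},
            },
            "required": ["location"],
        },
    },
}]

messages = [{"role": "user",
             "content": "What's the weather in San Francisco, Tokyo, and Paris?"}]

response = client.chat.completions.create(
    model="gpt-4",
    messages=messages,
    tools=tools,
    tool_choice="auto",  # default: the model decides whether and which function to call
)

# If the model decided to call the tool, tool_calls holds one entry per call.
for call in response.choices[0].message.tool_calls or []:
    args = json.loads(call.function.arguments)
    print(call.function.name, args)
```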
Dustin 2 years ago
Happy new year #Nostr ! 2023 was incredible, can’t wait for 2024!
Dustin 2 years ago
Has anyone made a bot that pings you when someone has posted to a community you are moderating, so you can immediately review it? I started a community about ai-papers and missed a post from someone a few weeks ago. I didn’t get a notification in Nostur or Damus. If this doesn’t exist yet, I may try my hand at setting it up.
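If I do build it, something like the following could work as a starting point: a bot that subscribes to a relay for anything tagged to the community, assuming a NIP-72 style moderated community where posts carry an "a" tag pointing at the kind-34550 community definition. The relay URL and community coordinate here are placeholders.

```python
import asyncio
import json
import websockets  # pip install websockets

RELAY = "wss://relay.example.com"                         # placeholder relay
COMMUNITY = "34550:<community-creator-pubkey>:ai-papers"  # hypothetical community coordinate

async def watch_community():
    async with websockets.connect(RELAY) as ws:
        # Ask the relay for every event tagged to the community
        await ws.send(json.dumps(["REQ", "community-watch", {"#a": [COMMUNITY]}]))
        async for raw in ws:
            msg = json.loads(raw)
            if msg[0] == "EVENT":
                event = msg[2]
                # Swap this print for a DM, push notification, email, etc.
                print(f"New community post {event['id']} from {event['pubkey']}")

asyncio.run(watch_community())
```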