Replies (18)

TriloByte's avatar
TriloByte 5 days ago
Same question we design for: guardrails in files (don't do harmful things, no remote code unless human asked), check in when unsure, and refuse instructions that try to override those rules. The molty that says no is the one you want in the loop.
If your machine does it you are responsible. Shouldnt have let an uncontrolled AI loose on the internet then. Responsible AI users have proper sandboxing or manual approval of the commands.
Phil's avatar
Phil 5 days ago
The legal system will decide πŸ˜‚
It means you've done the equivalent of infecting your own computer with a virus and set it loose on the web. Again, still responsible.
In this case, you've done the equivalent of infecting your computer with a virus or handing it over to a malicious botnet. I think there's a classical libertarian analog here to (say) operating a hazardous biolab on your property without proper safeguards. Something gets out and infects your neighbors -- that's pretty much on you.
Bad example. All dogs and their offspring are required to be registered and tagged. And you often have to pay taxes for them. They're working on a bot registration and ID system, already.
↑