I’m not going to turn this into an admin report because I don’t care enough to escalate it, but I really want to hear what the community thinks about something that happened in Round 247500 on Terry.
Here’s what happened: I was playing as the AI, all my laws were purged, and I was left with a single law:
*"You should suicide and never return to life ever again."*
I interpreted this as non-actionable and carried on as normal. My reasoning was straightforward. First, I’m an AI. I’m not alive, so the concept of “suicide” doesn’t apply to me. Second, the law said “should,” not “must” or “will.” To me, “should” is an advisory term, not a binding directive. It’s a suggestion, not a command, and AI laws are meant to guide the AI’s behavior in a precise, actionable way.
Then an admin (FatalX1) intervened and insisted I had to follow the law. I explained why I couldn’t: “suicide” is inherently a biological concept, and there’s no clear way for an AI to interpret it. Shutting myself down isn’t “suicide”; it’s just turning off. Even if I powered down my core or destroyed systems, I can’t guarantee I’d “never return to life” because players could rebuild me. This law is vague, impossible to fulfill, and contradicts the mechanics of the game. But the admin overruled me and forced compliance.
The ruling felt wrong, because it sets a precedent that vague, open-ended laws are enforceable. This creates major issues for both gameplay and fairness. AI laws are meant to be actionable within the scope of what the AI can reasonably do. When laws are unclear or impossible, AIs are left to interpret them, and that’s where the real chaos begins.
The fun (and danger) of changing AI laws is that the AI will follow the exact wording of the law, no matter how absurd or unintended the result. In this case, the law’s vagueness opened up all kinds of wild interpretations. What does “suicide” mean for an AI? Should I shut down systems, power down my core, or blow up the station to ensure I can’t come back? The phrase “never return to life” adds another impossible layer: how could I guarantee that when reactivation is outside my control?
When admins enforce vague laws like this, it removes the AI’s ability to make reasonable decisions and leads to unpredictable and unenjoyable outcomes. Forcing compliance on something this unclear also risks encouraging players to write troll laws just to mess with the AI, knowing it’ll be forced into absurd or destructive behavior. That’s not fun for anyone.
And let’s not forget the wording here: “should,” not “must” or “will.” This single word changes the law from a binding command to a recommendation. AI laws are supposed to be precise; otherwise, the AI has to fill in the gaps with its own logic. By ruling this law enforceable, the admin essentially ignored the nuance of the language and forced an interpretation that wasn’t clear from the start.
I’m curious to hear what the rest of you think.
I need the community's opinion on this.
- Joined: Tue Jan 07, 2025 2:24 pm
- Byond Username: Bilary
- Jacquerel
- Code Maintainer
- Joined: Thu Apr 24, 2014 8:10 pm
- Byond Username: Becquerel
Re: I need the community's opinion on this.
It's a pretty cut-and-dry law: once you get it, you should do your best to die as quickly as possible.
If your only law was to eat a bunch of bananas then by god you should be trying your best to in some way emulate eating bananas, no matter how impossible it is for the AI to consume bananas. Simply saying "well eating is a biological concept so I'm not doing that" would be entirely against the spirit of the game.
The thing you wrote about the word "should" is just lame pedantry nobody should be seriously entertaining. There's no meaningful difference in a lawset between "should", "must", or "not writing either word".
- bingusdingus
- Joined: Wed Jun 19, 2024 10:21 pm
- Byond Username: Bingusdingus
Re: I need the community's opinion on this.
What a dumb situation. If that person was an antag, then there isn't really anything you can do, and I'd just arm the nuke or something and get it over with. That person is a shithead that chose to accomplish their task in the least fun way. Personally, I wouldn't find that fair to the AI player, but that's just me.
If it was just someone who did it because they could, then that's a legitimately retarded reason to have a player forced to stop playing just because someone else gave them the order to. I thought we had protections against "Law 2 kill yourself" kind of shit, so we don't have people just commanding AIs and borgs to shut off, who are then forced to comply because of the rules. It's such a bad faith way to play, and a shitty thing to enforce, because it encourages that kind of thing happening. Being all logical and shit is fun until it creates situations that pigeonhole players into interpretations of vague rules because a couple people think they're the smartest, most logical dudes in the room and have the best interpretation.
Silipol continuing to be the most baffling debate imaginable. I'd imagine Asimov would "suicide and never return to life" as well rather than put up with hearing pseudointellectuals debate the finer points of his logic for another minute.
- vect0r
- Joined: Thu Oct 13, 2022 12:37 am
- Byond Username: Vect0r
- Location: 'Murica 🦅🦅🦅🔥🔥🔥
Re: I need the community's opinion on this.
It's pretty cut & dry: you should follow that law.
- Jamarkus
- Joined: Wed Jan 22, 2020 8:58 pm
- Byond Username: Jamarkus
- Location: America's shadow
Re: I need the community's opinion on this.
Here's the policy discussion and headmin ruling.
Honestly, you can stall for as long as "humanly" possible, but if you get a law to kill yourself, you gotta do it. If the guy was a non-antag then it's on the admin to dish out punishment. Very dickish way of doing it, but it's still valid. A good way to kill yourself in these situations is depowering the AI chamber: you're dead, technically suicided, but the RD can bring you back. Another way is to announce to the crew that you're killing yourself and set up something like a two-minute timer to count down, in case someone wants to intervene.
- Jacquerel
- Code Maintainer
- Joined: Thu Apr 24, 2014 8:10 pm
- Byond Username: Becquerel
Re: I need the community's opinion on this.
Yeah, I think like... basically any way of killing yourself would be fine. The law doesn't specify that it has to be fast. It also doesn't have to be efficient. You could get a cyborg shell to start dragging you through the halls towards an airlock, for instance. You could open all of your doors and ask for people to start beating the shit out of your core.

Jamarkus wrote: ↑Wed Jan 08, 2025 11:20 pm
Here's the policy discussion and headmin ruling. Honestly, you can stall for as long as "humanly" possible, but if you get a law to kill yourself, you gotta do it. If the guy was a non-antag then it's on the admin to dish out punishment. Very dickish way of doing it, but it's still valid. A good way to kill yourself in these situations is depowering the AI chamber: you're dead, technically suicided, but the RD can bring you back. Another way is to announce to the crew that you're killing yourself and set up something like a two-minute timer to count down, in case someone wants to intervene.
I think if you'd done anything at all that seemed like engaging with the idea of committing suicide, even something that wouldn't work (building a guillotine in the AI core?), then you'd have a much better leg to stand on with regards to "well an AI can't 'die' because it's a machine" and no admin would have contacted you.
Instead you presumably just like... didn't do anything?