This one is just a quick brain fart, but I've caught myself thinking about it a few times now and thought I should really write it down.
It starts with a simple question:
If we want our society to follow the laws, why do we make them so hard to understand?
You might say "it's too complicated to be simple", but the 10 commandments were okay-ish and a bunch of them were wasted on concepts like "God". (I'm sure other religions have their versions too - and they are probably equally imperfect.) We can probably do better than a God scribbling on tablets - I have a computer!
My goal here is to boil down a set of rules for society that everybody can agree to.
This is my simplest proposal:
- Don’t be evil.
Ignoring some other problems, it’s quite a low bar to set for a society. We can do slightly better:
- Try to be good, don’t be evil.
Now we’re getting somewhere! We’re not saying “be perfect”, we’re just saying “be good”. It’s a reasonable expectation of people who interact with each other. There is a small problem with this though…
The concepts of good and evil are quite subjective - this problem extends all the way from personal perception to entire societies. If societies could agree on good and evil, there probably wouldn't be so many conflicts. Let's define these terms:
- Good: increases an actor's physical agency.
- Evil: decreases an actor's physical agency.
Awesome! To really explore the idea, run the obvious cases through it: killing someone removes all of their agency; stealing something removes their agency over it; locking someone up removes their agency to go anywhere. The basic cases appear to be covered - most of the commandments fall out of it and we're only at one rule.
This extends nicely: the more agency you decrease, the higher the punishment. There must of course be some threshold below which it's simply not worth ruling on - e.g. standing in a person's way while walking is a minor inconvenience and should likely not be punishable.
The punishment itself would scale with the decrease in agency the offending actor caused in other(s). If punishments range from nothing all the way up to death, the scale should be non-linear, tending towards death but never quite reaching it (an asymptote).
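To make the shape of that scale concrete, here is a minimal sketch in Python of one possible asymptotic curve. The names, units and constants are all invented for illustration - they aren't part of the proposal:

```python
import math

# Toy sketch only: made-up scale to illustrate the shape of the curve.
MAX_PUNISHMENT = 100.0  # severity just short of "death" on some invented scale
THRESHOLD = 1.0         # harm below this is too minor to rule on at all
K = 0.05                # how quickly the curve climbs towards the maximum

def punishment(harm: float) -> float:
    """Punishment grows with harm but only ever approaches the maximum."""
    if harm < THRESHOLD:
        return 0.0
    # 1 - e^(-k*x) rises from 0 and asymptotically approaches 1, so the
    # result approaches MAX_PUNISHMENT but never quite reaches it.
    return MAX_PUNISHMENT * (1.0 - math.exp(-K * (harm - THRESHOLD)))
```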
It must be shown that you purposely took an action to drastically remove the agency of another. Accidents shouldn't be punished, unless it is shown that they were caused by negligence.
The scenario: You are being robbed and you take action to prevent this.
This seems reasonable, so we add a new rule to allow this:
- You may act to increase or maintain your own agency, but not at the cost of another’s agency.
You pull out your gun and shoot them in the head. Hmm, we can make this a little more fair:
- You may act to increase or maintain your own agency, but not at a cost higher than the decrease of another’s agency.
Sounds good. But you push away the robber and they end up losing a tonne of agency. Perhaps they were trying to steal that chocolate bar because they are diabetic. Suddenly you’re in jail because they dropped dead. We should probably guard against this:
- You may act to increase or maintain your own agency, but not at a predicted cost higher than the decrease of another’s agency.
Seems good enough for now.
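As a rough illustration of how that predicted-cost comparison might work, here's a small sketch. The numbers and names are made up - nothing here is a real metric for agency:

```python
def defence_allowed(predicted_cost_to_other: float,
                    threatened_loss_to_you: float) -> bool:
    """A defensive act is allowed only if its predicted cost to the other
    actor is no higher than the agency they were about to take from you."""
    return predicted_cost_to_other <= threatened_loss_to_you

# Shoving a robber away to keep your wallet: small predicted cost, allowed.
print(defence_allowed(predicted_cost_to_other=2.0, threatened_loss_to_you=5.0))    # True
# Shooting them in the head over a chocolate bar: huge predicted cost, not allowed.
print(defence_allowed(predicted_cost_to_other=100.0, threatened_loss_to_you=1.0))  # False
```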
They touch intimately as they move closer… He grabs her by the arm and immediately goes to jail. Okay, a person needs to be able to willingly forfeit their own agency. There are many scenarios: perhaps you want to donate a kidney, give somebody a present, or perform some other selfless act. We need something in place for this:
- You may forfeit your agency only if aware of the consequences of doing so.
Now we need to help people unable to help themselves: children, the disabled, the elderly, the suicidal, those close to death, etc. For some reason they are either completely or partially unable to make decisions for themselves. Others need to be able to intervene in their best interest:
- You may decrease the agency of another if it is likely to increase or maintain their agency in the future more than it otherwise would have.
We spoke about stealing, for example, but not about the ability of an actor to actually own something. So we say:
- An actor owns their own body and any previously unclaimed item that they have claimed. Ownership can be transferred to another actor via trade.
Ownership in this case only applies where it is meaningful. For example, taking ownership of distant planets is not meaningful if you can't do anything with them. You also can't claim things, or do things with them, if that would affect the agency of others. Ownership should be considered an extension of the actor itself.
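To see how the claiming and trading part could hang together, here's a tiny sketch of an ownership registry. The structure and names are purely hypothetical - the point is just that "claim what's unclaimed, then transfer via trade" is easy to state precisely:

```python
class Registry:
    """Toy ownership registry: an item is owned by at most one actor."""

    def __init__(self) -> None:
        self.owner_of: dict[str, str] = {}  # item -> owning actor

    def claim(self, actor: str, item: str) -> bool:
        # Only previously unclaimed items can be claimed.
        if item in self.owner_of:
            return False
        self.owner_of[item] = actor
        return True

    def trade(self, seller: str, buyer: str, item: str) -> bool:
        # Ownership only transfers if the seller actually owns the item.
        if self.owner_of.get(item) != seller:
            return False
        self.owner_of[item] = buyer
        return True

registry = Registry()
registry.claim("alice", "chocolate bar")         # True - it was unclaimed
registry.claim("bob", "chocolate bar")           # False - already claimed
registry.trade("alice", "bob", "chocolate bar")  # True - alice owned it
```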
There are still edge cases - some I've thought of and some I'm yet to think of. This is just a starting framework. But to be fair, it's fewer than 10 commandments and it covers a hell of a lot more. I'll keep processing it in the back of my mind - who knows what will come of it.
This brain fart was inspired by a 2017 article by Salge et al. titled "Empowerment As Replacement for the Three Laws of Robotics" - well worth a read.