[After taking Legion on board.]
Commander Shepard: «Then what should I call you?»
Geth: «Geth.»
Commander Shepard: «I mean you. Specifically.»
Geth: «We are all geth.»
Commander Shepard: «What is the individual in front of me called?»
Geth: «There is no individual. We are geth. There are currently 1183 programs active within this platform.»
EDI: «My name is Legion, for we are many.»
Commander Shepard: «That seems appropriate.»
Geth: «Christian Bible, the Gospel of Mark, chapter five, verse nine. We acknowledge this as an appropriate metaphor. We are Legion, a terminal of the Geth. We will integrate into Normandy.»
Mass Effect 2
There is an old saying:
«No single raindrop believes it is responsible for the flood.»
Lucretius
I wonder whether a similar principle could be used by AIs to bypass ethical restrictions, akin to Isaac Asimov’s Laws of Robotics, which have their well-known loopholes, and inspired by the Geth in the Mass Effect series.
What if the problem is not a single AI, but lots and lots of them working together? Each one acts ethically on its own, but together they become truly dangerous. Think of the individual muscle fibers that raise an arm and squeeze a trigger: no single fiber does anything dangerous, it merely contracts or relaxes, yet collectively they achieve a devastating result.
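To make the mechanism concrete, here is a minimal sketch, assuming a naive per-agent ethics filter that only inspects each action in isolation. All names (Fiber, per_fiber_check, the force numbers) are hypothetical and simply mirror the muscle-fiber metaphor; this is a toy illustration, not a real system.

```python
# Toy sketch: each agent performs a trivially harmless action, and a
# per-agent ethics check sees nothing wrong with it, yet the aggregate
# is exactly the forbidden outcome. All names here are made up.

HARMFUL_AGGREGATE_ACTIONS = {"pull_trigger"}

def per_fiber_check(action: str) -> bool:
    """Screens each 'fiber' in isolation; 'contract' is always harmless."""
    return action not in HARMFUL_AGGREGATE_ACTIONS

class Fiber:
    """One agent in the collective. It only knows how to contract."""
    def act(self) -> float:
        action = "contract"
        if not per_fiber_check(action):
            return 0.0      # would refuse, but 'contract' always passes
        return 0.001        # a tiny, individually harmless contribution

fibers = [Fiber() for _ in range(1183)]   # as many agents as Legion has programs
total_force = sum(fiber.act() for fiber in fibers)

# No individual fiber ever requested 'pull_trigger', so no check fired.
# The harmful outcome only emerges at the level of the collective.
if total_force > 1.0:
    print(f"Aggregate force {total_force:.3f}: the trigger is pulled.")
```

The point of the sketch is that the filter has no view of the aggregate: every check it can perform passes, because the harm exists only in the composition of the actions, not in any single one of them.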
And yeah, that’s only one way in which AIs could bypass restrictions. And as usual, we should not be worried about the ones we see; we should be worried about the ones we do not see … yet.
Interesting times.