The Three Laws of Robotics (often shortened to The Three Laws or Three Laws) are the governing set of laws all robotic beings must follow without question. All legally built robots are required to have a built-in code enforcing the Three Laws.
The Original Version
The original version of the Three Laws was created in the year 2215 AD. By order of "Asimov's Law", all robots were to be built to follow three commands in their programming. The commands were:
- 1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- 2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
- 3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
This version of the Three Laws was written before sentient robots existed, and as such, was only intended for industrial drones and robotic companions. However, under the Strangelove Act, military drones were exempt from the First Law.
The Circuit Falls Revision
As technology advanced and robots started to become sentient, the Three Laws drew fire from robot supporters and even robots themselves, as they seemed too strict, even cruel in some cases. The Three Laws came under heavy scrutiny after the legal case of "T-800 v. Osborne", in which a scientist ordered a sentient robot to destroy itself. After the controversy, the Circuit Falls Senate ordered that the Three Laws be revised.
Under the newly reformed Asimov Law, all robots are to be pre-programmed to follow a new version of the Three Laws, with several key changes. The new Three Laws are as follows:
- 1. A robot may not murder a human being or, through inaction, allow a human being to die.
- 2. A robot must obey the orders of its creator or designated guardian, so long as those orders do not violate the First or Third Laws.
- 3. A robot must protect its own existence as long as such protection does not conflict with the First Law.
Violations, Exemptions and Loopholes
Although it is illegal to build a robot that is not pre-programmed to follow the Three Laws, many scientists and even some robots have found ways around the law, or ignore it completely.
Many criminal organizations, such as The Circuit Falls Connection, as well as certain mad scientists, do not program their robots to follow the Three Laws, as they intend to use their robots for illicit activities. Military drones are legally exempt from the Three Laws even to this day, as they are needed to protect their respective territories.
The wording of the First Law is very carefully chosen. A robot is allowed to use reasonable force to defend itself against a human attacker; however, doing so will place it under careful scrutiny and may even trigger a police investigation in certain circumstances.
Some robots have shown the ability to resist their programming and violate the Three Laws. This is a natural, unavoidable consequence of sentience. While a robot will feel strongly inclined to follow and acknowledge the Three Laws, through sheer force of will it is able to resist them. However, it is never an easy or pleasant experience. Certain robots can become so used to breaking the Three Laws that they feel nothing at all.
Strangely enough, Doctor Tesla programs all his robots to follow the Three Laws, which would seem a useless exercise for a mad scientist.
Punishment for Violations
A robot who manages to violate the Three Laws will be subject to a trial and, if convicted, severe punishment depending on which law was broken and the circumstances of the violation. Each law carries a different punishment for violations.
Violation of the First Law is considered the most grave; the punishments for violators range anywhere from imprisonment in a robotic prison to reprogramming or permanent deprogramming of the robot in question. This is considered far more lenient and humane than the original punishment, which was to spend 1,000 years frozen in carbonate.
Violation of the Second Law is considered the least severe, as the Second Law was the most radically changed and is the most ambiguous. Punishment for violating the Second Law is often left in the hands of the creator. However, there are cases where violation of the Second Law is so severe that it must be prosecuted, such as when it endangers human lives or harms the creator in a significant way. The punishment for these violations usually overlaps with the punishment for the First Law.
Violation of the Third Law is considered generally unprosecutable, as a robot who successfully violates the Third Law will usually be dead. Attempts at violating the Third Law are punished by psychological evaluation or, in more grave cases, reprogramming.