Our Name
The Zeroth Law

Isaac Asimov was one of the most prolific writers of the 20th century; over the course of his lifetime he published more than 500 books spanning a wide range of genres. He is best known, however, for his three science fiction series, "Foundation", "Empire", and "Robot", which span more than 14 volumes and were not written in chronological order. As Asimov's expansive mind stitched together the fabric of a futuristic galactic civilization over the course of 50 years, he occasionally had to go back and "insert" critical elements into his timeline. It's a classic example of revising one's "first principles" to reach a specific outcome once that outcome has been clearly defined. In other words, to get to the ending he wanted, he had to introduce foundational elements into the beginnings.

In the 1940s, Asimov introduced a set of rules intended to govern the behavior of robots within his novels. The laws are as follows:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings except where such orders would conflict with the first law.
  3. A robot must protect its own existence as long as such protection does not conflict with the first and second laws. [2], [3]

Asimov later added a "Zeroth Law," which stated that "a robot may not harm humanity, or, by inaction, allow humanity to come to harm". Though he hinted at this law early on, it wasn't until 1985 (near the end of his life) that he inserted it into his timeline. The idea that technology must act in the best interest of humanity as a whole, superseding the interests of any individual, had a profound effect on the way Asimov's characters (both humans and robots) interact, and it continues to shape the way we view technology today.

At Zeroth Technology we embrace Asimov's zeroth law in two ways:

  1. In how the law was derived: the idea that certain primitives are required to reach a desired outcome, but that we may not know those primitives when we start.
  2. In what the law stands for: the belief that any technological advancement must be built in such a way that it does no harm to humanity, and that protecting humanity supersedes protecting individual interests.

This is the basis of our philosophy: we invest in, and in some cases help build, the technological primitives that become the foundation for systems that benefit humanity and actively prevent harm from coming to us.

Citations