A Zombie, A Prisoner, and a Robot Enter into a Social Contract

In this episode, we discuss social contract theory and how it is explored in The Walking Dead, Lost in Space, and Orange Is the New Black. What would a state of nature, a state before civil society, look like? What would people be like in such a state? What would their motivations be? Would characters like Rick Grimes, Negan, and Dr. Smith naturally emerge?

Our first episode just went over TEN THOUSAND DOWNLOADS! Let’s keep it going!

Here is the music we used in this episode:

“Enter the Maze” Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0 License
http://creativecommons.org/licenses/by/3.0/

“Teddy Bear Waltz” Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0 License
http://creativecommons.org/licenses/by/3.0/

“Graveyard Shift” Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0 License
http://creativecommons.org/licenses/by/3.0/

“Village Consort” Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0 License
http://creativecommons.org/licenses/by/3.0/

“Secret of Tiki Island” Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0 License
http://creativecommons.org/licenses/by/3.0/

“Loopster” Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0 License
http://creativecommons.org/licenses/by/3.0/

“Attack of the Mole Men” Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0 License
http://creativecommons.org/licenses/by/3.0/

6 Comments

  1. This was great y’all. It brought to mind a problem that I have always had with the notion of “the state of nature.” The state of nature is intended to represent a scenario where all social structures, institutions, etc. are either stripped away or not yet in existence. In practice, however, people seem to use it to mean the absence of macro-level social structures (e.g., governments, institutions, formal organizations) and EXPLICIT, codified social contracts. I can imagine a scenario where that could happen (any post-apocalyptic scenario would suffice), but I can’t imagine a scenario where microstructures are stripped away. As is evident in The Walking Dead, when macrostructures topple, microstructures (generally, some form of tribalism) quickly emerge to fill those gaps. This is why I’ve always thought that Hobbes was just dead wrong. He uses the state of nature as a thought experiment to uncover “human nature,” but he assumes away a fundamental aspect of human nature: we are natural cooperators. Language, our ability to utilize symbols to represent objects in the social world, is fundamentally about coordinating with other humans. We are psychologically hardwired to feel empathy for other humans. These thought experiments only stand up to scrutiny if we assume that humans are *merely* rational and self-interested, and ignore all of the other (prosocial) features that define human animals.

    [Disclaimer: it has been a while since I’ve read or thought about Hobbes, Locke, or SC theory, so correct me if I am misrepresenting them.]

  2. Okay, I’ve got a philosophical scenario for you pertaining to AI. In Black Mirror, season 2 episode 4, there is an advanced AI assistant that is not regarded as conscious, yet to the viewer it is plain that the AI does in fact have consciousness, even though it has no real form besides code and silicon. This idea also comes up in Westworld season 2 in the attempt to copy humans: thousands of trials are run to see what makes someone tick in order to create a perfect replica. Do you think AI as a program should be protected as a life form? And if so, how can we help those who would see it as just another computer program to view the AI as a different kind of conscious being?

Comments are closed.