Sunday, October 4, 2020

HELP I’M TRAPPED INSIDE A COMPUTER: The Chinese Room Argument

The Chinese Room argument tries to show that neither computer programs nor any other form of physical symbol system can be intelligent independent of the hardware they run on. This is to say that Strong AI cannot exist on modern computers, while Weak AI can.

Imagine a room. In that room is an English-speaking man with a giant book of instructions and a set of cards which hold every possible Chinese character. The man has a way to receive messages from the outside world, communicated in Chinese symbols. He doesn’t speak Chinese, but his big book contains instructions, written in English, that show him pictures of Chinese characters and tell him which of his cards he should use to respond to each message. It just so happens that his instruction book is written really well, so well in fact that every time he receives and sends a message the response seems coherent. For example, if the Chinese-speaking outsiders send in a message which asks, “What did you think of the fifth Fast & Furious movie?” his response would be, “I loved the chase sequence with the cars and the giant safe,” even though both messages were in a language the man cannot understand. 

Or can he understand it? That is the core question raised by the Chinese Room Argument. If one believes that he cannot speak Chinese, as John Searle (the creator of the thought experiment) believes, then this shows that Strong AI cannot arise solely from a physical symbol system (1980). The metaphor is as follows: physical symbol systems work on a process of input/processing/output. Computer programs do this, the human brain does this, and the Chinese Room does this. However, as the thought experiment tries to show, it is possible to run a computer program which engages with language but which does not “understand” it (more on understanding later). The man in the room, alongside his equipment, replaces the hardware that a program would run on, while his instruction book is a computer program for all intents and purposes, since both run on a process of strict instructions, such as if/then rules.
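The if/then character of the room can be sketched as a tiny lookup program. This is my own illustrative toy, not anything Searle wrote; the rulebook contents and the fallback reply are made up for the example. The point is that every line operates on the shapes of symbols, with no representation of meaning anywhere in the system.

```python
# A toy "Chinese Room": replies are produced by rule-following alone.
# The program matches symbol shapes; nothing in it "knows" what they mean.
RULEBOOK = {
    "你觉得《速度与激情5》怎么样？": "我喜欢那场汽车拖着巨型保险箱的追逐戏。",
}

def chinese_room(message: str) -> str:
    """Return the scripted reply for a message, or a stock fallback.

    The fallback means "Please say that again" -- a fact the program
    itself has no access to; it is just another string of symbols.
    """
    return RULEBOOK.get(message, "请再说一遍。")
```

To the outsiders, a well-stocked `RULEBOOK` looks like fluent conversation; inside, it is only pattern matching, which is exactly the gap Searle is pointing at.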

Searle thinks about language through the lens of intentionality. Human beings who are speaking a language have a purpose and a semantic understanding of what they are saying. For example, someone who is asking about Fast Five knows what the film is, knows what a film is, and knows that other people have thoughts about films. Someone who is responding with their opinion on Fast Five has developed that opinion by evaluating their qualitative experience with the film. Use of language is not purely syntactic; when people talk to each other, Searle thinks they are doing more than just following a set of rules. This is because humans are not just a program; they are a “program” run on a specific set of hardware which creates intentionality. The brain is a semantics-making machine, and physical symbol systems on their own are not. According to Searle, without semantic understanding, syntactic manipulation is not sufficient for language comprehension (1980). Given the assumptions of this argument, for us to create Strong AI we would also have to create “Strong Hardware” equivalent to the human brain, and no account of intelligence could exist only at the computational and algorithmic levels.

I am not convinced by the Chinese Room argument. I am also not convinced by the counterpoints. I find the whole discussion to be rather semantically driven. I’m not convinced that brains and computers are synonymous in any meaningful way, and I’m not familiar with any branch of science that leans so heavily on metaphor and analogy, outside of the way that humans talk about psychology. As far as I’m aware there’s no, “Solar systems are constructed like molecules,” because they, “are structured, linked by forces, and act holistically,” or similarly constructed theories in physics or chemistry. The most famous metaphorical theory I’m aware of is the use of the “string” metaphor in string theory, but most of the physicists I know and/or follow are pretty dismissive of its relevance today. However, the Chinese Room Argument is a metaphor which is trying to take down another metaphor, the cognitive scientist view that the mind is a form of software and the brain is a form of hardware, so it’s boring analogies all the way down.

I do want to talk about a potential argument against the Chinese Room that hasn’t been brought up yet in any of my reading: what if we live inside a computer simulation?

Searle believes that a mere program could not achieve general intelligence, and this includes a program which tries to digitally simulate the physical processes of the brain. He proposes an alternative thought experiment where the man in the Chinese Room manipulates a series of complicated pipes as a response to the input Chinese characters. Once he manages to get the pipes into the correct sequence, they are rigged to give out an appropriate Chinese output. In this instance, the physical aspects of the brain are represented as the series of pipes, but the man and the pipes “certainly” don’t understand Chinese (1980).

Enter philosopher Nick Bostrom. Bostrom believes that there is a strong chance that we are living inside a computer simulation right now (2003). The argument goes like this: let’s imagine that humanity manages to survive into a “posthuman” age where we have achieved a technological jump beyond our current levels of imagination; we as a species have done this before, per the jump from the Bronze Age to now. This would likely include very powerful computing technology. If we were to reach that stage, there is a high likelihood that we would be using our computing technology to simulate our evolutionary history. In fact, we would likely run thousands upon thousands of different simulations to gather as much information as possible. Let’s imagine that these simulations are powerful enough to simulate the human brain, something Searle admits may be possible but still dismisses as “not understanding.” For the sake of argument, pretend that the posthuman society is running 3,000 simulations at any given time. This means that there is one “real world” of human brains and 3,000 simulations of human brains going at any given time. Statistically speaking, then, we are probably not living in the one world in 3,001 that eventually reaches a posthuman stage, but instead in one of the 3,000 worlds in 3,001 that were simulated by a society which already got there. We are probably some pretty Strong AI.
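The headcount behind those odds is simple enough to sketch. The 3,000-simulation figure is the post’s own stipulation, not Bostrom’s; the function name is mine.

```python
# Bostrom-style headcount: if one "base" history runs alongside
# n simulated ones, and you cannot tell which you are in, then by
# indifference the chance you are in base reality is 1 / (n + 1).
def p_base_reality(n_simulations: int) -> float:
    return 1 / (n_simulations + 1)

p = p_base_reality(3000)  # 1/3001: odds of being simulated are 3000/3001
```

With 3,000 simulations running, the chance of being in the one unsimulated world is under 0.04 percent.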

If we live in a simulated world, would Searle argue that neither he nor I nor anybody in the world holds understanding or intentionality? Maybe from some sort of existential perspective he’d be right, but then he’d be crafting an argument around the concept that he doesn’t understand anything, which largely makes his point moot. If nothing else, this argument gives me a bit of empathy for the man inside the Chinese Room. What is his name? How long has he been in there? Does he have access to the bathroom? What is his concept of God?

Citations

Bostrom, N. (2003). Are we living in a computer simulation? The Philosophical Quarterly, 53(211), 243-255. Retrieved October 4, 2020, from http://www.jstor.org/stable/3542867

Searle, J. R. (1980). Minds, brains, and programs. Behavioral and Brain Sciences, 3(3), 417-457.

Van Manen, H., Atalla, S., Arkhipov-Goyal, A., Sweijs, T., Hristov, A., Zensus, C., & Torossian, B. (2019). Macro implications of micro transformations: An assessment of AI’s impact on contemporary geopolitics (pp. 20-23, Rep.). Hague Centre for Strategic Studies. doi:10.2307/resrep19557.4


