Searle: Philosophy of Mind, lecture 6

John Searle

Philosophy of Mind, lecture 6

UC-Berkeley Philosophy 132, Spring 2011

MP3s of the entire course:

Comments

I am not sure that the pain argument is a strong argument against materialism.

mohamedmilad

"calculators don't calculate... the calculation is something we do with the calculator". Is that right?

homphysiology

This one is about the Chinese Room Argument.

Discussion on the Chinese Room Argument begins at 22:20.

ameya_

I'm always looking for new, interesting lectures on psychology/philosophy, so please let me know if you have any recommendations; it would be highly appreciated.

davidfost

If you have a cord section that interrupts C nerve signals, you are less likely to experience pain. Pain requires the terminal end, and the emotional part of the brain gets collateral signals.

mohamedmilad

Can somebody say the name of the logical theorem he was referring to at 1:08:15?

nejiknya

"There is something else going on." Correction: there seems to be something else going on. Perhaps all that there is is syntax and it just seems like we deal with meaning, just like in the Chinese room.

josuepineiro

Nah... you don't understand. The Chinese Room is about the fact that you cannot get semantics out of syntax. Brains work on the basis of semantics, and you cannot get that from computation or from the laws of physics as we know them. There is something else going on.

ROForeverMan

Great comment, very helpful, thank you!

Rocky_

Searle's reductionist argument can also be adapted to prove that brains don't think. All a brain does is move molecules around and fire electrical signals. Molecules don't have semantics or intentionality. Electrical signals don't have semantics or intentionality. Semantics and intentionality are necessary for consciousness and understanding. Therefore brains don't have consciousness and understanding, any more than the Chinese Room does.

VoyagesDuSpectateur

Ha - well the computer says it's conscious, so why would it lie??

MrPatrickDayKennedy

Wow! Searle seems to think that saying something defies common sense is a reasonable objection to an argument. I suppose quantum physics is thus refuted? Never mind the room. The system comprising you and the book understands Chinese, and is conscious just as much as anyone is, which is almost certainly not at all.

myAutoGen

As of 2013, if you spend a couple of thousand bucks, the computer performs a couple of trillion operations per second (short scale), rather than a couple of million.

LukaszStafiniak

Ok here is a problem with the Chinese Room experiment: Let's say the computer is asked, "How do you feel?". Note that the response to this question can depend on a great many things. For example, maybe the computer determines it is sad because there was a power outage the night before, or maybe it lost a chess match to a human. In principle, there could be any number of reasons that cause the computer to respond that it is sad.

Now... according to Searle, he has the English version of the computer program which he runs manually. Consider what happens while he is carrying out all the computations necessary to answer this question. He too learns there was a power outage the night before. He too learns that the computer lost a chess match to a human. In short, he too learns exactly why and how the computer responds that it feels sad. The only thing he doesn't bother to learn is how to translate the answer into Chinese, though he could even do that if he wanted to.

But recall, the Chinese questioners agree that the computer passes the Turing Test. Thus, Searle must also agree since there is nothing special about translating the English answer into Chinese. The "mind" in this experiment is not the shuffling of symbols, but all the factors that go into giving the answer to the question. If the processing of those decisions is sufficiently complex, then we must agree the computer has a mind, regardless of the language it uses to communicate its intention! Please refute that.

johnnavarra
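A minimal, purely illustrative Python sketch of the kind of state-dependent responder the comment above describes. The state fields, rules, and wording are invented here, not taken from the lecture or the comment; the point is only that the answer is produced by the machine's internal factors, not by the language the final string happens to be rendered in.

# Hypothetical sketch: a reply to "How do you feel?" that depends on
# internal state. All field names and rules are invented for illustration.
def how_do_you_feel(state):
    """Return a mood report derived from the machine's internal state."""
    reasons = []
    if state.get("power_outage_last_night"):
        reasons.append("there was a power outage last night")
    if state.get("lost_chess_match"):
        reasons.append("I lost a chess match to a human")

    if reasons:
        return "I feel sad because " + " and ".join(reasons) + "."
    return "I feel fine."

# Example run: the reply is fixed by the state, and could just as well be
# rendered in Chinese as in English without changing what determined it.
print(how_do_you_feel({"power_outage_last_night": True,
                       "lost_chess_match": True}))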

He continually attempts to refute many theories of artificial consciousness, yet never actually lays down his own theory. What the Chinese Room thought experiment actually raises is the problem of other minds, for which there is no actual refutation.

robotaholic