The Hard Problem of Consciousness

For millennia, humans have pondered the nature of one of the most puzzling aspects of our existence: consciousness. The fact that you can feel, touch, taste, think, and sense things, and have an entirely private world of emotions and sensations, is something that, even in the heyday of scientific progress, seems cryptic.
Comments

Great video - you explained The Hard Problem perfectly in a few minutes and with nice illustrations. In my opinion The Hard Problem cannot ever be solved or answered in a way that will satisfy us, because consciousness has no business in a physical, deterministic universe where everything can be explained as mechanisms based on physical laws and illustrated in a book. There is nothing in physics that can even begin to explain subjective experience. So there must be something fundamental about energy and matter that we have totally missed, and even if we could discover what that is, we wouldn't understand it, because we are only really able to comprehend something if it can be illustrated on a piece of paper.

steenpedersen

And they ask you, [O Muhammad], about the soul. Say, "The soul is of the affair of my Lord. And mankind have not been given of knowledge except a little." (Quran, al-Isra' 85)

sooolix

Of course there is no 'soul' in the brain. Consciousness/awareness is the soul. And the mind, body and world are appearances in the soul, in consciousness.

Our current (materialistic) model is so contrived, whereas if we just look at our experience, consciousness is obviously primary and everything else is secondary. Whether you call it consciousness, awareness or soul doesn't matter. It's that which sees everything and stays the same eternally. It also solves the hard problem of consciousness: it's not that we have consciousness, we are it. You cannot lose consciousness, only consciousness 'of the world'.

Sydebern

Consciousness is quite simple to understand from the Advaita point of view: consciousness is not an object that can be measured scientifically, and consciousness in Advaita is quite different from the concept of consciousness in neuroscience.
For example, from the neuroscience perspective, consciousness is not present during sleep.

Now there are 3 points:
1) Consciousness
2) Object
3) Experience

Consciousness is the basis of everything.

According to neuroscience: any experience is consciousness.

According to Advaita: Experience = Consciousness + object.

Example:
Eyes as the object (or one type of information) + Consciousness --> the experience of seeing.
That is why no experience happens in deep sleep: there is no point of contact with consciousness.

These are basic concepts/Darshana from the roughly 10,000-year-old study of the Upanishads.

And yes, only one reality exists (or, more correctly, advaita, meaning 'not two'), called Nirguna Brahman or Consciousness.

rahultyagi

Here follow some of my thoughts on this topic. I think I might have the answer to this problem, but I have no evidence. At the end I suggest an experiment that could be done to strengthen my view.

Note: my English is not the best.

We invent words and definitions in order to describe different things. I would argue that the current definition of consciousness is rather poor: it can cause confusion and it makes it hard to solve The Hard Problem. The definition is not even that easy to state, and versions differ, but usually it is something like "an individual's ability to use its mental abilities to interpret the outside world and react adequately to it". You could say it is when you are awake, i.e. not sleeping.

In my view, the current definition of consciousness involves too many things. That is why it can be good to start by distancing oneself a bit from the normal way of thinking about what consciousness is and from its usual definition. But I will also try to explain what most people normally have in mind when they think of consciousness, because it can still be good to have a definition or description that is fairly close to the one we currently have: we want to be able to describe when a human is conscious vs unconscious, or awake vs asleep. It is similar to how we want to differentiate between a little stone and a huge mountain. Materially these could be seen as the same thing (both are made of stone); it is just that one is much bigger than the other, and since we want to be able to describe both things, we have two words, a stone and a mountain. One could say that the smallest component is a stone, and then you can build a whole mountain out of it. I think the current definition of consciousness involves too much: to continue the comparison, right now "consciousness" describes a whole mountain and not the smallest component.

I think the smallest component of consciousness is this: when something has meaning seen from its OWN perspective. Consciousness is created the instant something contains meaning/information, seen from ITS perspective. Whatever the information is, that is what the current perspective is conscious about. One can take the perspective however one likes: any lump of anything, big or small, many or few neurons, a stone or many stones. I even think you can take the perspective of a whole country or a whole planet. Although it is probably easier with neurons or electrical pulses, at least when it comes to creating living beings that want to move around, because then more meaning can be built more easily than with, for example, stones. What is this stone conscious about, seen from ITS perspective? Is there any meaning there at all? I am guessing mostly nothing. How about a dog? Seen from ITS perspective, what is it conscious about? I am guessing it has a lot of different smells in its consciousness.

The meaning/information can be anything at all, a feeling or a concept etc. For example, this combination of X number of something (it can be anything) means "red". Then "red", seen from those X number of something, is what it is conscious about; red is its inner world, red is what it experiences. However, nothing more happens there at all, no time, no feelings; it is not conscious about those things. It is only conscious about red and nothing more, forever. Also, the word red has no further meaning seen from its perspective; red has more meaning for us humans, but I am giving a simplified example. This level of consciousness that these X number of something have I would suggest calling "empty consciousness" or something like that, because it is very far from what we usually think consciousness is.

Working memory makes it so that we humans can hold a certain number of bits of information active at the same time, for instance in order to learn new things. I believe it is actually working memory that gets described for the most part when people normally think about what consciousness is. When we say, for example, "I was conscious of that sound", it means you had that sound in your working memory. Consciousness and working memory in this context are pretty much the same thing.

Again, I have no evidence for this; it is just how I think all of this works.

So in my opinion, consciousness is nothing unique. It exists EVERYWHERE and in infinite amounts! The question is just WHAT information something is conscious about! We can take any perspective, and it can hold any information. Whatever piece of information is there is what the current perspective is conscious about!
If the information is “red”, it is conscious about “red”.
If the information is “pain”, it is conscious about “pain”.
If the information is “I am walking”, it is conscious about “I am walking”.

When one tries to imagine how something else experiences its world, it is important NOT to project YOUR consciousness onto it, for example onto a stone, a tree or another human. Computers are definitely conscious in my opinion, but the question is just WHAT do they have in THEIR consciousness? It is probably quite different from what we humans get in OUR consciousness. What does the active information mean from ITS perspective? Simplified, I am guessing "go right", "start", "go towards player" and so on. Again, "empty consciousness". "Go right" might not mean anything to the computer, and that can be hard to imagine as a human. For a human it can be hard to imagine and understand the computer's inner world and experience (its qualia).

Also, there is nothing that says a computer needs to have only one consciousness. It fully depends on where we place the perspective, which lump of activity we look at (zeroes and ones). The same goes for humans: there is actually nothing that says a human has to have only one consciousness, although through evolution it has probably turned out that we have one bigger working memory.

There is a model that describes four components of working memory: the phonological loop, the episodic buffer, the visuospatial sketchpad and the central executive. So it could be argued that right there we have four consciousnesses. But in the end it only feels like we have one consciousness in our mind; that is the experience we get. The experience is this way because that is the information we get in our working memory, probably because it has been an evolutionary win to feel that you are just one unit, since you only have one body to take care of. That is why the brain has been sculpted throughout evolution so that you feel you have only one consciousness. It would probably be dumb to feel that you are many individuals, for example "yes, my second me is hearing that sound"; it is better to feel "yes, I am hearing that sound".
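
As a toy sketch only (the component names come from the model above; everything else, including the "unified report" idea, is just my own illustration in Python, not an implementation of that model):

    # Four separate working-memory "stores" whose contents are reported as ONE
    # stream, so the system feels like a single consciousness.
    from dataclasses import dataclass, field

    @dataclass
    class WorkingMemory:
        phonological_loop: list = field(default_factory=list)       # sounds / inner speech
        visuospatial_sketchpad: list = field(default_factory=list)  # images / locations
        episodic_buffer: list = field(default_factory=list)         # bound episodes
        central_executive_focus: str = ""                           # what is attended right now

        def unified_report(self) -> str:
            # Everything active is reported as one "I", never as "my second me".
            active = self.phonological_loop + self.visuospatial_sketchpad + self.episodic_buffer
            return "I am currently aware of: " + ", ".join(active)

    wm = WorkingMemory()
    wm.phonological_loop.append("a sound")
    wm.visuospatial_sketchpad.append("a red shape")
    print(wm.unified_report())  # -> I am currently aware of: a sound, a red shape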

This could be a good test/experiment for scientists to look at: to see if it is possible to find two consciousnesses in one and the same brain at the exact same time. I am guessing this is most easily done with split-brain patients. You could set up an experiment where you try to see whether one of the brain halves can learn and remember X, and at the same time whether the other half can learn and remember Y, without either half getting to know the other half's X or Y. Then you ask whether the right brain half remembers Y, and also whether it remembers X, and the same for the left brain half: does it remember X? Does it remember Y? If each half only remembers what it was itself supposed to remember, that would suggest we had two consciousnesses at the exact same time.
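
A rough sketch of the logic of that test (purely hypothetical; the teach/recall steps are placeholders for whatever presentation and questioning method experimenters would actually use):

    # Hypothetical split-brain test: teach each hemisphere its own item,
    # then check that neither hemisphere recalls the other's item.
    def run_split_brain_test(teach, recall):
        # teach(hemisphere, item): present `item` only to that hemisphere
        # recall(hemisphere, item) -> bool: does that hemisphere report remembering `item`?
        teach("left", "X")
        teach("right", "Y")

        results = {
            "left remembers X": recall("left", "X"),
            "left remembers Y": recall("left", "Y"),
            "right remembers Y": recall("right", "Y"),
            "right remembers X": recall("right", "X"),
        }
        # Two simultaneous "consciousnesses" would look like: each half remembers
        # only its own item and not the other's.
        two_streams = (results["left remembers X"] and not results["left remembers Y"]
                       and results["right remembers Y"] and not results["right remembers X"])
        return results, two_streams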

The main problem I see when people try to solve The Hard Problem of Consciousness is that they do not believe we are robots; they think there is a fundamental difference between a robot, a human, a stone or a p-zombie. They see consciousness as something magical, mystical, spiritual or unique that robots, computers or the p-zombie will never have. They are thinking something like this:
"If we are only a robot, then why does experience have to be there? If we are a robot, then experience does not need to be there; the robot can just do what it is programmed to do, no need for consciousness. Why do we not just do things in the dark?" But then they are missing (in my opinion) that these robots definitely are conscious; it is just that right now it is probably a very different experience from what we humans have. Again, everything depends on the meaning, the information something is conscious ABOUT seen from ITS perspective. In my opinion, we ARE a robot, a biological robot! And there is no existential difference! In my view, the philosophical zombie is impossible, because if the p-zombie existed it would be conscious!

I believe it will be pretty much inevitable in the future to admit that AI (artificial intelligence) is conscious just as we humans are. It will probably happen when the AI is as good as or better than us at speaking and writing, and when the AI ITSELF tells us that it is conscious and argues for it in a convincing way.

All feedback is welcome. Do you agree, or do you not agree at all?

kaptenr

Very good video. Here's an observation: a cockroach with a less complex nervous system can exhibit a low level of consciousness, but that doesn't eliminate the possibility of it having a conscious experience. Just as when we're sick we might take in low levels of sensory information, yet that low-level information is still experienced and the subjective experience is still there. And one more point: the whole materialistic, reductionist approach is done and perceived in the light of this subjective experience alone. Think about it: we are conscious first, and only then conscious of all the ideas and experiments we do. I think a truer understanding of ourselves lies in this conscious subject rather than in the objective explanation of being a byproduct of the brain.

Music-eots

All "why" questions are hard problems.

hidgik

As a human you don't have consciousness; it's the other way around. Consciousness has you, a human experience.

tomaszmielniczek

Maybe consciousness is simply a measure of awareness, and having a creature be more conscious helps survival because it lets it study enemies, the environment and its advantages, and plan accordingly; i.e. instead of charging into some bird's nest for food where you could die from a stealthy lion or desert bird, you could instead notice it's dangerous and decide to look elsewhere. This is so beneficial that through evolution creatures would get more and more conscious, gaining better abilities to survive.

catea

For some, only humans possess Consciousness. Others think that not only humans but also other animals, or even plants, possess consciousness. I am one of the first, and as such I consider that Consciousness makes us different from other living beings. Certainly, the fact that Consciousness makes us different does not prevent us from recognizing how much we have in common with other living beings.
Explaining what Consciousness is requires explaining what makes us humans different. Hence, it is not necessary to explain the paradigm of mental states with qualia, a phenomenon common to living beings with a brain.

guillermobrand

With the exception of the passage regarding religion at about four and a half minutes, this is the best short exposition I have seen so far. Thanks.

tomsharp

A question to Chalmers and his followers. Why should we accept the assumption that there can exist zombies which are physically and functionally identical to us but don't have consciousness? Isn't that what Chalmers wants to prove? If he is wrong, such zombies, in fact, must have consciousness.

aleksandarlikic

6:20 - 6:30 You conflated two very different things there, pal. Awareness and self-awareness are not the same thing. You may argue that animals, insects and plants have a form of awareness, but it does not necessarily follow that they have a form of self-awareness.

For example, when you are sleeping, if somebody pokes you, your body may respond as if it were aware of it, but you are not necessarily aware that they poked you unless you were awake or the poke woke you up.

reasonablechristianity

I think we don't have to dissociate feelings from evolutionary theory. If we feel pain today, it's because billions of other specimens died billions of years ago because they didn't feel pain, or at least not with the same intensity as us, so the feeling of pain is an (unexpected) result/consequence of evolution.
Pain is a natural sense some species developed, and it is really natural, as I said. Otherwise, for example, how could a mother save her child from a road accident if she didn't feel "fear"? Is there any other "thing" we can think of, apart from the complex concept of "fear", that would drive a mother to save her child? I think the answer is simply no.
Speaking of fear: fear is a neurally generated sensation (imagined pain) for avoiding eventual pain; we can think of fear as a pre-pain stage.
Perceiving feelings the way we perceive them is a normal/natural behaviour of every being capable of evolving/mutating. It may sound wrong because of the "subjectivity" of the issue; there are always differences in how we all perceive things, not only feelings but also visual things and more. For example, we all know what the colour white is, but is the interpretation our mind provides exactly the same as everyone else's? Biologically we can say no, but we can be fairly sure it is significantly close.
So I think the way we perceive feelings is relatively the same as long as we have relatively the same anatomy.

zakariahabib

It's a hard problem because our science is still at an infantile stage when it comes to neurobiology and neurochemistry. The feeling of a subjective experience may be illusory, but that doesn't matter all that much. I think that even though we are bad at complex computation, we are extraordinarily good at finding seemingly unrelated patterns and creating subjective theories from them. That seems to be a clear advantage over automaton-like computation systems.

brad

Your arguments against dualism weren't very convincing. You were supposed to bring up the interaction problem. The distributed nature of the information processing in the brain has nothing to do with whether or not there is a soul.

Also, if you are going to bring up consciousness possibly being "granted" to us by evolution, you should bring up the question of "how?". Saying that evolution gave it to us just pushes the hard problem back a level.

stucrab

The soul is first-person experience; no one knows how to describe it objectively. This could be the reason why...

tomazflegar

We cannot actually know that it is possible for mindless automata to do what, e.g., humans do without conscious experience. In the example it may be tempting to take away some functionality to get the simple programming we might see in a computer, but that would make the comparison lopsided. Functionality and experience may still be proportionate and inextricably linked.
If you ask me, the real problem is that we don't truly know what the Hard Problem wants to know; it's more of an exclamation of wonder and mystery, and maybe of computational irreducibility or Gödel's incompleteness theorem. How can we be asked to answer the question if it is itself based on assumptions that aren't proven? I would say there is no actual problem. Consciousness is emergence that is inevitable.

jopmens

It's not like there has to be a reason; just by chance there is. Does it assume there is free will? We just happen to claim our experience is some special thing, but if we had the ability we could make an AI think just like us, and it would be just as valid.

calebgrasse

I wonder if this is a solid argument (edit: never mind lol):
If you make a robot, no matter what, what it does and perceives is objective: you can determine what it will do because you know the environment (the inputs) and the code (which determines the robot's outputs). Therefore a robot can never be conscious, for it has no free will and can't think freely.
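
(Just to spell out the "known inputs + known code = predictable output" point with a trivial, made-up example:)

    # Determinism in a nutshell: with the same code and the same inputs,
    # the robot's output is always the same.
    def robot(inputs):
        return [x * 2 for x in inputs]  # stand-in for whatever the robot's code does

    assert robot([1, 2, 3]) == robot([1, 2, 3])  # identical every run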

This logic can be applied to a human too: apart from code, there are neurons. Nothing else changes. So in this view humans too are not conscious.
Which would then mean it is as inhumane to slaughter a human as a cow, a dog, or a fly, because they are all at the same level of consciousness and, in a sense, intelligence. (Though morally this certainly feels wrong; a human life is valued much more highly than an ant's, though this could just be evolution/self-preservation speaking and not actually any sort of consciousness.)

The only chance that stuff can actually be conscious, meaning able to think freely and, if copied into an exactly identical universe, able to have different reactions to the same action, is if there's some weird quantum teleporting mumbo jumbo going on.


Yeah, woah, that's a weird idea. Maybe it's true and humans are too scared to admit it, or maybe it can't be the answer because people collectively think so differently on the subject. Maybe, like teleportation questions, any questions about consciousness are null and not answerable... because the concept is impossible.

catea