Yes, it's pretty much the same one thing said twice.

anonymous66 wrote: ↑Sat Sep 28, 2024 5:13 pm
Are you suggesting that mental states are identical to brain states?

Atla wrote: ↑Sat Sep 28, 2024 5:11 pm
Mental and physical are one and the same thing; we have just developed a weird cognitive double vision over the centuries. There is no actual evidence.

anonymous66 wrote: ↑Sat Sep 28, 2024 5:02 pm
Okay - property dualism. If property dualism is the case, then mental properties exist. Qualia (thoughts, feelings, sensations, etc.) have no physical properties - they are mental properties (the first-person experiences involved in qualia) - therefore mental properties exist - therefore there is evidence for property dualism.
Theories of Consciousness
-
Flannel Jesus
- Posts: 4302
- Joined: Mon Mar 28, 2022 7:09 pm
Re: Theories of Consciousness
Yes, anything that's not fundamental must be emergent. So minds must also be fundamental or emergent. Some materialists might think minds are fundamental (which may disqualify them from being materialists, but who am I to say?), some may think minds are illusory (an odd hypothesis, as it seems to presuppose a mind to experience the illusion of a mind, no?), and the rest are going to think minds are emergent. Probably the majority, but I haven't run a survey.

anonymous66 wrote: ↑Sat Sep 28, 2024 5:06 pm
Are you alluding to emergence? The idea that neural processes arise from a physical brain?

Flannel Jesus wrote: ↑Thu Sep 26, 2024 6:41 pm
Thoughts can be the consequence of material processes, rather than just "material things". A process doesn't have to have a weight, right?

anonymous66 wrote: ↑Thu Sep 26, 2024 6:28 pm
This appears to be a problem for materialism. I understand materialism to be saying that everything can be defined in terms of its physical properties. If so, then what is the mass of a thought? How much space does it take up?
-
anonymous66
- Posts: 56
- Joined: Fri Jan 12, 2018 9:08 pm
Re: Theories of Consciousness
The author of Ultimate Questions in Philosophy, Second Edition, can explain the problem better than I can...

Atla wrote: ↑Sat Sep 28, 2024 5:16 pm
Yes, it's pretty much the same one thing said twice.

anonymous66 wrote: ↑Sat Sep 28, 2024 5:13 pm
Are you suggesting that mental states are identical to brain states?

"In spite of this attractiveness, the identity theory is open to objections. One crucial objection is well illustrated by the philosopher John Searle. Searle writes:

Imagine that your brain starts to deteriorate in such a way that you are slowly going blind. Imagine that the desperate doctors, anxious to alleviate your condition, try any method to restore your vision. As a last resort, they try plugging silicon chips into your visual cortex. Imagine that to your amazement and theirs, it turns out that the silicon chips restore your vision to its normal state. Now imagine further that your brain, depressingly, continues to deteriorate and the doctors continue to implant more silicon chips. You can see where the thought experiment is going already: in the end, we can imagine that your brain is entirely replaced by silicon chips; that as you shake your head, you can hear the chips rattling around inside your skull. In such a situation there would be various possibilities. One logical possibility, not to be excluded on any a priori grounds alone, is surely this: you continue to have all of the sorts of thoughts, experiences, memories, etc., that you had previously; the sequence of your mental life remains unaffected...

Searle's thought experiment points to a crucial weakness of the identity theory. The identity theory identifies mental states with brain states. It is thus committed to the general claim: wherever there is no brain, there is no mind! But is this claim really plausible? The thought experiment clearly shows that it is logically possible that a being whose brain has been replaced by silicon chips has a mind. This shows that a strong version of the identity theory (sometimes called "type-type identity theory"), according to which mental states and brain states are necessarily identical, is false. The identity theory leads to a certain kind of chauvinism by insisting that beings without the appropriate "hardware" (i.e. brain states) cannot have a mental life. This seems problematic in light of the fact that lower animals, for example an octopus, seem perfectly capable of experiencing mental states like pain without having a brain similar to ours. It also appears problematic in light of the possibilities that advanced machines or space aliens might have minds. Suppose, for example, that we encounter an alien space creature that is made entirely of previously unknown materials. We certainly would not be entitled to conclude that it does not have a mind simply because it does not have a brain like we humans do. Although our own mental life seems intimately tied to our brains, the general claim that mental states are brain states appears suspect. This does not mean that we have to abandon the identity theory completely, but it does entail that we have to modify the theory to make room for the fact that other creatures besides humans could have minds."
Re: Theories of Consciousness
Searle? Nah - any Western philosophy argument can be refuted in seconds. "Mental states = brain states" is part of the universal claim: mental world = physical world. Just as the physical world doesn't end at the edge of your brain, the mental world doesn't end there either; the two are in fact one and the same. Your example assumes that mind is an individual and limited thing, so it's a strawman.

anonymous66 wrote: ↑Sat Sep 28, 2024 5:52 pm
The author of Ultimate Questions in Philosophy, Second Edition, can explain the problem better than I can...
Flannel Jesus
Re: Theories of Consciousness
Well, yeah - that's one way of talking about it. Or process theory. Edit: I meant process philosophy.

anonymous66 wrote: ↑Sat Sep 28, 2024 5:06 pm
Are you alluding to emergence? The idea that neural processes arise from a physical brain?
anonymous66
Re: Theories of Consciousness
I don't really see the appeal or explanatory power of emergence - it basically just says, "take it as a matter of fact that mental states emerge from brain states."

Flannel Jesus wrote: ↑Sat Sep 28, 2024 5:18 pm
Yes, anything that's not fundamental must be emergent. So minds must also be fundamental or emergent.
anonymous66
Re: Theories of Consciousness
If you're truly saying mental states = brain states, then how can you explain the reality that organisms with very different neurological systems all feel pain? Human brains are very different from the brains of octopuses. And if you stick to your assertion that mental states = brain states, then aren't you saying a priori that you know for a fact that "there can be no conscious organism without a brain"? Yet I don't believe it is controversial to say that there is no logical reason why a conscious being like Korg (from the Marvel Universe) couldn't exist.

Atla wrote: ↑Sat Sep 28, 2024 6:19 pm
Searle? Nah - any Western philosophy argument can be refuted in seconds. "Mental states = brain states" is part of the universal claim: mental world = physical world. Just as the physical world doesn't end at the edge of your brain, the mental world doesn't end there either; the two are in fact one and the same. Your example assumes that mind is an individual and limited thing, so it's a strawman.
anonymous66
Re: Theories of Consciousness
Looking back at my OP - Galen Strawson accepts property dualism and panpsychism.

Flannel Jesus wrote: ↑Sat Sep 28, 2024 5:18 pm
Yes, anything that's not fundamental must be emergent. So minds must also be fundamental or emergent.

But Galen Strawson (he wrote a paper titled "Physicalism Entails Panpsychism") identifies as a materialist (or physicalist) - he agrees that there is only one type of matter; it's just that he accepts that qualia are real, so physical stuff has mental properties. I find it hard to refute that sentiment.

I can see a possible progression of the terms materialism and physicalism. At one time, those who rejected all types of dualism were referred to as "materialists". But as time went on, the term changed to "physicalists" because of advances in topics like spacetime, physical energies, forces, and exotic matter. Perhaps in the future, as we learn to accept the reality of qualia, we can develop a new term that incorporates mental properties into our understanding of matter.
Flannel Jesus
Re: Theories of Consciousness
Well, of course, when you don't really try to understand something, you can boil it down to something that sounds really silly. That's not hard.

anonymous66 wrote: ↑Sat Sep 28, 2024 7:55 pm
I don't really see the appeal or explanatory power of emergence - it basically just says, "take it as a matter of fact that mental states emerge from brain states."
anonymous66
Re: Theories of Consciousness
Maybe I just haven't found the right explanation of how emergence is supposed to work.

Flannel Jesus wrote: ↑Sat Sep 28, 2024 8:36 pm
Well, of course, when you don't really try to understand something, you can boil it down to something that sounds really silly. That's not hard.
Flannel Jesus
Re: Theories of Consciousness
I don't think physical stuff needs to have mental properties in order for physical processes to result in mental properties. In general, I don't think any large-scale thing that's composed of tiny things needs those tiny things to have the same properties as the large thing they compose. I don't think the things that make up a chair need chair-like properties. I don't think the things that make up a penis need penis-like properties. I don't think the things that make up a cell need cell-like properties, etc.

anonymous66 wrote: ↑Sat Sep 28, 2024 8:34 pm
Looking back at my OP - Galen Strawson accepts property dualism and panpsychism. But Galen Strawson (he wrote a paper titled "Physicalism Entails Panpsychism") identifies as a materialist (or physicalist) - he agrees that there is only one type of matter; it's just that he accepts that qualia are real, so physical stuff has mental properties. I find it hard to refute that sentiment.

Emergence isn't an explanation; it's a category of explanations. Maybe that's why you feel unsatisfied by it - you're placing expectations on it that it's not trying to meet. It's just a category. Emergent explanations are explanations in which the component pieces don't have the same properties as the large things they compose, yet the large-scale properties are still the natural consequence of those small component pieces. Any explanation for which that is the case is an emergent explanation. So the word "emergence" is not *the explanation itself*, just a description of the kind of explanation.

If you've ever tried to explain a large-scale behaviour based on the behaviour of interacting component pieces, you've given an emergent explanation.
anonymous66
Re: Theories of Consciousness
After typing this, I began to consider... Perhaps what Dennett did in his book From Bacteria to Bach and Back could be considered an explanation of how consciousness emerges in the human brain. If so, I find his ideas to be plausible.

anonymous66 wrote: ↑Sat Sep 28, 2024 8:39 pm
Maybe I just haven't found the right explanation of how emergence is supposed to work.

But his ideas are still lacking in that they don't explain how consciousness emerges from unconscious matter in the first place.
Re: Theories of Consciousness
Ehh, how do you know what other organisms feel? But I don't think nervous systems are all that dissimilar; they are electric. Pain is obviously not a singular feeling but a range/continuum of similar feelings, or multiple continuums (different pains).

anonymous66 wrote: ↑Sat Sep 28, 2024 8:12 pm
If you're truly saying mental states = brain states, then how can you explain the reality that organisms with very different neurological systems all feel pain? Human brains are very different from the brains of octopuses.

I don't know about this Korg - does it have no brain? And which meaning of "conscious" are you using here?

anonymous66 wrote: ↑Sat Sep 28, 2024 8:12 pm
And if you stick to your assertion that mental states = brain states, then aren't you saying a priori that you know for a fact that "there can be no conscious organism without a brain"? Yet I don't believe it is controversial to say that there is no logical reason why a conscious being like Korg (from the Marvel Universe) couldn't exist.
anonymous66
Re: Theories of Consciousness
Let's consider this for a while... Do you accept that an octopus feels pain?

Atla wrote: ↑Sat Sep 28, 2024 8:54 pm
Ehh, how do you know what other organisms feel? But I don't think nervous systems are all that dissimilar; they are electric. Pain is obviously not a singular feeling but a range/continuum of similar feelings, or multiple continuums (different pains).
Re: Theories of Consciousness
Yes.

anonymous66 wrote: ↑Sat Sep 28, 2024 8:59 pm
Let's consider this for a while... Do you accept that an octopus feels pain?