Ginkgo wrote:Hello again Gee.
The reason I mentioned dividing things into hard and easy problems was because I was hoping you could concede a few points to the physicalists. In this day and age we know that brains behave a bit like computers. They have memory, they are rational, they have a sleep mode, and they carry out logical processes.
I would love to concede a few points to the physicalists, but have a problem with it, as they tend to get carried away with their ideas. Example: "brains behave a bit like computers." This is not true. The truth is that computers behave a bit like a very small part of the rational mind part of the brain. The problem that I have with physicalists is that they like to ignore points that I think are relevant.
Ginkgo wrote:But how much can we concede to the physicalist explanation? In this day and age computers are not emotional. Dennett would say that is only true at the moment, but it won't be true in the future. The workings of a machine will, in the future, account for all of the feelings and emotions humans have.
Is that so? The interesting thing about the way that I study consciousness is that I learn how things work. So I am aware that emotion is external; it, like the sub/unconscious mind, is communal, shared, outside of the body. So if a computer had this quality of emotion, then it would also have a "personal space"; it could be read by a person who reads auras; its best friend could take one look at it and know that it needed cheering up.
So you are telling me that Dennett thinks that a computer can do this?
Ginkgo wrote:Now this is where it gets interesting from Chalmers' point of view. Chalmers would agree that it may well be the case that, now and into the future, all of the things we call consciousness can, and perhaps will, be accounted for in terms of matter in motion. The whole business is just physical processes at work. But all of these things Dennett would call the easy problem, and they suffer from a major problem: the problem of experience. Chalmers would probably say that no matter how sophisticated the machine, a mechanical explanation will never explain experience.
Experience is a very small part of the problem. As I mentioned above, emotion does not work internally; neither does awareness, neither does feeling. And if by some miracle we could make a computer that was emotional, aware, and had feelings, then it would be alive, and we would have to deal with the greatest problem: all life wants, and at the least it wants to survive and continue. What kind of a nightmare would that be?
Ginkgo wrote:For Chalmers the hard problem is the easy problem as well. Perhaps we could say that consciousness is like a coat that fits around the mechanical explanations, keeping in mind that it is possible that a mechanical explanation covers all types of consciousness.
There might be a metaphysical explanation for consciousness, but I seriously doubt that there can be a mechanical explanation.
Ginkgo wrote:This is why Chalmers introduced the idea of a philosophical zombie. The zombie is the machine, is the human. The zombie is exactly like a human with a computer brain, just as every human might be explained in these terms. The only difference is that the zombie lacks experience; it doesn't have "what it is like" experiences. So when I abuse a philosophical zombie by calling him names, he will, like any human, be offended and sad. The difference being that the zombie is not really offended or sad; it is just the appropriate mechanical mechanism taking place in the brain. So the zombie ACTS hurt and offended, whereas a real person IS hurt and offended.
Agreed. There can be an imitation of the brain, and an imitation of consciousness, but it is not real.
Ginkgo wrote:Would this be how you understand the hard and easy problems? Because this is where I was trying to lead you.
Pretty much. I would like to read Dennett's "Consciousness Explained", just to see how he justifies his assumptions. But I would also like to read Chalmers' "Hard Problem" to see how close he is. I have not yet tried to read them, because I am not sure that I can. Does either of them write with plain speaking, or do they feel the need to put 16 different "isms" and "ologies" in each paragraph? I ask because I now have problems learning new terminology; one of my many gifts from multiple sclerosis.
In your next post, you apologized for misreading my intentions. No apology is necessary, as if there is fault, it is mine. I find your writing interesting and appreciate your responses.
Gee