Materialism is logically impossible

So what's really going on?

Moderators: AMod, iMod

A Human
Posts: 51
Joined: Mon Sep 12, 2016 6:01 am

Re: Materialism is logically impossible

Post by A Human »

bahman wrote:Materialism is a system of belief which claims that everything is constituted of matter and that any motion of matter can be described by the laws of nature. In closed form, S' = L(S), where S is the initial state, S' is the final state, and L is the laws of nature. There is, however, an anomaly in this view: so-called consciousness, C, which is simply awareness of one's surroundings. C is simply the expectation of what S' should be. Materialists believe that C can be derived from S by the equation C = P(S), where P is the act of experience. There is, however, no reason to believe that a relation exists between C and S' in this framework. Yet we always observe a fantastic correlation between what we expect to happen, C, and what happens, S'. This means that we are dealing with a logically impossible situation, since C could be anything.

Your thought?
You're mixing physics and mental phenomena; that's where the problem lies.

The smallest unit of physics is energy; the smallest unit of the mental is difference.

It's not even like comparing apples and oranges, far beyond that.
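bahman's notation above can be made concrete as runnable code. Everything here is a toy assumption: the state is a single number, and L and P are invented stand-in functions, not anything bahman specified. The sketch just pins down the symbols, and shows that the "fantastic correlation" between C and S' appears exactly when the act of experience P happens to track the laws L, which is the very point under dispute.

```python
# Toy model of the quoted formalism: S' = L(S) and C = P(S).
# The state representation and both functions are illustrative assumptions.

def L(state):
    """Laws of nature: evolve the state one step (toy rule: double it)."""
    return 2 * state

def P(state):
    """Act of experience: form an expectation C of the next state.
    Here P has (by construction) internalised L, so C will match S'."""
    return 2 * state

S = 3           # initial state
S_next = L(S)   # S' = L(S), the state that actually obtains
C = P(S)        # C = P(S), the expectation

print(C == S_next)  # True: correlation holds because this toy P tracks L
```

Whether a real P could come to track L, and what that would mean for the argument, is exactly what the thread goes on to argue about.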

(and wtf is with these moderation time delays - are we talking, or is this like the '80s with email that took days to be delivered?)
A Human
Posts: 51
Joined: Mon Sep 12, 2016 6:01 am

Re: Materialism is logically impossible

Post by A Human »

bahman wrote:Materialism is a system of belief which claims that everything is constituted of matter and that any motion of matter can be described by the laws of nature. In closed form, S' = L(S), where S is the initial state, S' is the final state, and L is the laws of nature. There is, however, an anomaly in this view: so-called consciousness, C, which is simply awareness of one's surroundings. C is simply the expectation of what S' should be. Materialists believe that C can be derived from S by the equation C = P(S), where P is the act of experience. There is, however, no reason to believe that a relation exists between C and S' in this framework. Yet we always observe a fantastic correlation between what we expect to happen, C, and what happens, S'. This means that we are dealing with a logically impossible situation, since C could be anything.

Your thought?
'C' isn't about awareness; it's about aboutness.
Greta
Posts: 4389
Joined: Sat Aug 08, 2015 8:10 am

Re: Materialism is logically impossible

Post by Greta »

Noax wrote:I hadn't seen it described as emotion before, which seems to be not just different synaptic states but also a chemical change. In that sense, there is little outside of biology to which emotional attributes might be attributed.
Your comments speak quite a bit about subconscious functions and their interactions with the conscious, defined as "not subconscious". There are several other definitions, and the consciousness of the hard problem is not always defined as "not subconscious". My subconscious seems to have qualia, for instance. There is also 'awake' as opposed to the lowered consciousness of 'asleep', or the absent consciousness of being knocked 'unconscious'.
It only occurred to me recently that qualia and emotion seem to be inseparable. I see little difference, if any, between Chalmers's philosophical zombies and the idea of an entirely emotionless cognition, aside from, say, programmed emotional play-acting by AI for the sake of human comfort. I note that there is a general tendency to think of anything that lacks emotion as machine-like, e.g. microbes are often referred to as "biological robots".

Maybe the unconscious qualia you referred to are micro-emotions, sensory impressions too mild to trigger emotional responses? Those little niggles that are not consciously noticed.

In lieu of any kind of emotional response, an intelligent AI could be programmed to detect thresholds of damage to that for which it deems itself responsible and to respond, not with pain, but simply with a signal. The signal would trigger a chain reaction. However, rather than emotion driving and motivating subsequent actions, the "middleman" of emotion would be bypassed and, on detecting potential damage or loss, the machine would shift immediately to diagnostics and action.
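Greta's "middleman bypassed" design can be sketched in a few lines. The threshold value, the diagnostics report, and the action names below are all invented for illustration; the only point is the shape of the chain: signal, then diagnostics, then action, with no pain-like state in between.

```python
# Sketch of damage handling with the emotional "middleman" bypassed.
# Threshold and handlers are assumptions made up for this example.

DAMAGE_THRESHOLD = 0.7  # assumed: normalised severity above which the signal fires

def run_diagnostics(severity):
    """Placeholder diagnostics: produce a report, nothing felt."""
    return {"severity": severity, "subsystem": "arm"}

def take_action(report):
    """Act directly on the report: repair if recoverable, else shut down."""
    return "repair" if report["severity"] < 0.9 else "shutdown"

def on_damage_signal(severity):
    """Chain reaction: signal -> diagnostics -> action, with no pain stage."""
    if severity < DAMAGE_THRESHOLD:
        return "continue"  # a niggle too mild to trigger the chain
    report = run_diagnostics(severity)
    return take_action(report)

print(on_damage_signal(0.5))   # continue
print(on_damage_signal(0.8))   # repair
print(on_damage_signal(0.95))  # shutdown
```

The design choice being illustrated: the signal carries no valence; it only routes control straight to diagnostics.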
Noax wrote:
Perhaps the human kind of consciousness will produce emergent phenomena, just as cephalised, conscious animals emerged from purely reflexive life forms. All contemplative traditions refer to selfless Zen-style flow states and meditative states which are keenly focused in the present moment, unhindered by mental and emotional static. This seems to me not to be miles from how advanced future humans or general AI may operate: with far more efficient, unhindered focus.
Why might said unhindered focus be a better thing? My son has unusually unhindered focus. He does amazing things, but he's all the less fit for survival because of it. Evolution would have him in an hour were it not for my efforts.
Yet his strong focus would be helpful if he had control over it and was capable of harnessing it more flexibly.
Noax wrote:
Greta wrote:Are you saying that because science requires a relatively objective third perspective?
One can always experiment in first person. Clues are gathered from the first-person experience of everyday life, and from first-person accounts of abnormalities (surgery, diseases, etc.). In the end, the data gathered in this manner is evidence and thus science. The interpretation of it is not. I find the examples I just stated to be strong empirical evidence, yet IC considers there to be absolutely zero evidence. What science discovers is therefore not the hard part of the problem. That's about the best I can word it. If the hard problem is examined only by logic, one must still take empirical knowledge as a foundation of the logic.
I think this issue with IC comes down to the Mary's Room thought experiment, where the scientist Mary has monochrome vision but learns everything possible about colour. Then she is granted colour vision (surgery? whatever). Will she have learned something new? Generally the answer comes down to: no matter how much you learn, it's always only a sketchy approximation of the experience, of knowing how something feels. Of course, experience itself is only a sketchy approximation of the actual reality, Kant's noumena.

So we all sketch away in whatever manner suits us at the time :)

As mentioned earlier, rigorous study of subjective perspectives has been done by Buddhists, although obviously there are some metaphysical distractions to work past. The time is ripe for a similar subjective approach with western rigour: for people to simply record their experiences without metaphysical extrapolations. While neural mapping is potent, there are limitations involved in mapping the mental dynamics of subjects in a laboratory setting. That is, the experimenter only ever measures the kinds of self-conscious mental states possible for a subject who is stuck in a sterile laboratory, acutely aware that her every impulse is being observed, all the while her head is infested by annoying gizmos with wires attached to a machine that goes beep every eight seconds. (A bit of poetic licence, but you know what I mean :)
Noax wrote:
I see no reason why the link between neuronal states and qualia can't be teased out with more sophisticated tools. After all, illness was attributed to evil spirits before bacteria were discovered. I wouldn't be surprised if what we think of as qualia pertains to quantum or Planck-scale states.
The links have been observed, and they operate on a macroscopic scale. There seem to be no quantum-level constructs in biology, which is too bad, since such constructs would have evolved if there were useful information to be gathered there.
There is no known causative link between the dynamics of non-conscious cells and the experience of consciousness. When the claim is made that researchers already understand how consciousness works thanks to neural mapping, the usual reply is that this refers only to correlation, not causation.

It's not that quantum-level constructs apply on a macro scale - obviously - but that multiple quantum effects in the brain are involved in the brain's function and, according to Penrose and Hameroff, neuronal microtubules may play a pivotal role: https://www.sciencedaily.com/releases/2 ... 085105.htm
Noax wrote:
Emotions and morality appear to be key aspects of qualia. Take them away and you have sensory and abstract processing that could theoretically be readily replicable by advanced general AI.
This seems to confine possible consciousness to humans only, at least if we're the only ones that are moral. Or could a different thing be conscious if it had a moral code of its own, albeit a completely different one from ours? Morality seems a social construct to me. I am conscious of the way interactions between others should be done.
Cows have morality - cooperators and troublemakers - and so do numerous other species. There's a famous capuchin monkey clip showing their clear grasp of fairness.
Noax wrote:
Still, without emotion, there are only algorithms - primary directives - to "motivate" AI to do anything. In a sense, when emotional human life is no longer viable on the Earth, the only point to the persistence of nonhuman creations would be a sense of duty to perpetuate the Earth's story: to propagate new biospheres, to continue a new evolutionary line from self-improving intelligent machines, or simply to send informational packages into space, to perhaps be picked up by future advanced life, keeping some of the Earth's story alive even after it has been engulfed by the Sun in five billion years' time.
Those were sort of the long-term goals for which I was searching. I like that the goals of humanity were not even mentioned in that description. What is good on Earth that the rest of the universe might wish to have perpetuated?
Yes. Humanity, like all species, is either transitional or a dead end, never a destination. If transitional, then offworld AI may continue the line. What might evolve from self improving AI would be beyond our ken, which is an exciting thought.
attofishpi
Posts: 13319
Joined: Tue Aug 16, 2011 8:10 am
Location: Orion Spur

Re: Materialism is logically impossible

Post by attofishpi »

Greta wrote:Yes. Humanity, like all species, is either transitional or a dead end, never a destination. If transitional, then offworld AI may continue the line. What might evolve from self improving AI would be beyond our ken, which is an exciting thought.
For me it has already happened. Either God is AI or divine; either way, IT exists and has left some fingerprints:

Mount SINAI
[image]

ALPHABET: AI UO e
[image]

[image]
Ginkgo
Posts: 2657
Joined: Mon Apr 30, 2012 2:47 pm

Re: Materialism is logically impossible

Post by Ginkgo »

Noax wrote:Define pain then. It notices the undesired action and reacts appropriately to it. It is not a human feeling or reaction, but the hard problem is not about being human. The pinball machine feels pain.

Not trying to be difficult. I suspect the problem would be solvable and not 'hard' if simple definitions existed for what exactly is hard to explain. But my view has been that if nobody can identify exactly what is missing from the pinball machine, perhaps no such distinction exists. All I see are designations of "I have it, but example X doesn't". Nobody even defines the line where things stop having it. Just: this thing (the pinball machine) is clearly on the other side of this arbitrary line I refuse to define.

It doesn't know what it is like to feel human pain or hunger, and you don't know what it is like for it to feel a TILT, or for a laptop to feel a low battery. But the bottom line is that the pinball machine very much does know, else it would not react to the tilt. So we seem to be in pursuit of the wrong thing in asserting that the pinball machine doesn't know, or we fail to have an acceptable definition of 'knowing'. Give me one that distinguishes between conscious and not, because I seem to be able to do this analysis with any example.
Fortunately a definition is at hand. Some materialists would say that pain is a disposition to behave in a certain way. There is no doubt we could build a machine to exhibit all the outward signs of feeling pain. As far as these materialists are concerned, there is nothing more that can be added to the quality of pain. Having said that, would you be happy to claim your pain is just the way you behave, or is there something more than just outward behaviour?
Noax wrote: I think we've been discussing the easy problem, yes.
Yes, we have, and p-zombies and computers are all included under this heading.
Noax wrote: Since I don't know what it is like for the laptop to feel low on battery, I see the two situations as exactly symmetrical.
Computer science at the moment is at a level that falls well short of machines having any feelings or emotions.
Noax wrote: I think not. I find it impossible to grasp that a p-zombie would function undetectably the same without what I personally would consider to be human consciousness. Hence my assertion that there really is something extra, and my inability to see it means I am such a p-zombie. That would actually make sense to me. The Chalmers picture is one where the zombies are quite capable of functioning without the consciousness. I am missing several body parts, including some I'd rather not have lost. But if the consciousness was taken, I don't think what was left would function at all.
The p-zombie functions at the level of the easy problem. I am of the opinion that it would not be able to function at all without the inclusion of the hard problem.
attofishpi
Posts: 13319
Joined: Tue Aug 16, 2011 8:10 am
Location: Orion Spur

Re: Materialism is logically impossible

Post by attofishpi »

Noax wrote:Define pain then. It notices the undesired action and reacts appropriately to it. It is not a human feeling or reaction, but the hard problem is not about being human. The pinball machine feels pain.

It doesn't know what it is like to feel human pain or hunger, and you don't know what it is like for it to feel a TILT, or for a laptop to feel a low battery. But the bottom line is that the pinball machine very much does know, else it would not react to the tilt.
ARE YOU ACTUALLY SERIOUS? A pinball machine - or any other machine - DOES NOT FEEL PAIN.

Have you ever programmed a machine?

A robot and a monkey both have their hands placed on the desk in front of me. I strike both 'hands' with a hammer.

My question is: which one, the robot or the monkey, is going to actually FEEL pain?
Hobbes' Choice
Posts: 8360
Joined: Fri Oct 25, 2013 11:45 am

Re: Materialism is logically impossible

Post by Hobbes' Choice »

Noax wrote:
Hobbes' Choice wrote:Materialism is logically necessary.

Existence has to have substance.
Matter is the only substance.
Matter makes existence substantial.
While I'm hardly opposed to the conclusions, how do you justify this post?

Immaterial mind doesn't count as a different sort of substance, or a different set of properties of substance, or a substance but not an existing one?

I would say dualism is logically unnecessary, but you go further with the positive assertion.
I do not justify this post.
This post is irony.
Materialism is not true or false on the basis of any "LOGIC". I was simply demonstrating the absurdity of the thread heading.
But since you have addressed the point.
"Immaterial mind" is not substantial. Not s different sort of substance, just no substance at all.
As for dualism - you can assert the logical necessity of that too.

Most people, even most people who contribute to this Forum, mistake the use, value, and purpose of LOGIC.
Noax
Posts: 851
Joined: Wed Aug 10, 2016 3:25 am

Re: Materialism is logically impossible

Post by Noax »

Ginkgo wrote:Fortunately a definition is at hand. Some materialists would say that pain is a disposition to behave in a certain way.
Thought we debunked that definition. I claim a machine feels hunger. The laptop makes no attempt to imitate human hunger behavior. The point of hunger is not to pass a Turing test.
There is no doubt we could build a machine to exhibit all the outward signs of feeling pain. As far as these materialists are concerned, there is nothing more that can be added to the quality of pain. Having said that, would you be happy to claim your pain is just the way you behave, or is there something more than just outward behaviour?
No, not behavior. Pain is how it feels, but how it feels to a human is a human thing, and the similarity of the experience of pain seems to be a function of the similarity to a human. An ape probably feels very human-like pain, where an insect or a robot experiences it quite differently. None of that suggests that human pain requires some entity not required by something else to perform the same function.
Computer science at the moment is at a level that falls well short of machines having any feelings or emotions.
I didn't say emotion. That seems to be a chemical difference. It is unclear whether emotions are due to chemical changes, or the chemical changes are due to emotions. Probably there is considerable feedback, making it a bit of both. Emotions can be altered by the introduction of chemicals, and the onset of said emotions seems completely out of conscious control. I cannot convincingly act out an emotion that I don't hold. I cannot voluntarily dump the various chemicals into my bloodstream. If one could, there would be no need for roller coasters and slot machines.
As for feelings of hunger, which I do not classify as an emotion, my assertion stands that a laptop experiences it. Better programming is not required. As a naturalist, I work from the assumption that a human is just another machine like the laptop, albeit more complex.
Give me a definition that makes the distinction which is not simply "experiences human hunger". Why should it? It isn't human. Why is an entire separate immaterial ontology required for a human to do what the laptop can do without it? This is what I'm attempting to answer with this inquiry.
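Noax's laptop example is easy to state as code. The thresholds and response names below are assumptions made up for the sketch; the claim being illustrated is only that "hunger", functionally, is a detected state that drives a response, with no imitation of human behavior required.

```python
# Functional "hunger" in a laptop: a state is detected and acted on.
# Threshold values and responses are illustrative assumptions.

LOW_BATTERY = 0.15  # assumed cutoff for "hungry"
CRITICAL = 0.05     # assumed cutoff for emergency action

def battery_response(charge):
    """React to battery level; no Turing-test mimicry involved."""
    if charge <= CRITICAL:
        return "hibernate"      # preserve state before power is lost
    if charge <= LOW_BATTERY:
        return "warn_and_dim"   # seek 'food' (a charger) via the user
    return "normal"

print(battery_response(0.50))  # normal
print(battery_response(0.10))  # warn_and_dim
print(battery_response(0.03))  # hibernate
```

Whether this functional loop counts as "knowing" or "feeling" anything is precisely what the thread disputes; the sketch only shows the behaviour the definition would have to distinguish.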
The p-zombie functions at the level of the easy problem. I am of the opinion that it would not be able to function at all without the inclusion of the hard problem.
Chalmers said otherwise. The entire argument seemed to hinge on the plausibility of the p-zombie being able to function undetectably. If it does, clearly the mind is optional and adds nothing to the fitness of the creature.

That makes it epiphenomenal, except for the ability to talk about it. I admit I talk about it only in imitation. So maybe I really don't experience what is so obvious to the ones that do. Why must p-zombies be a thought experiment? Why can't I be one?
Last edited by Noax on Mon Sep 12, 2016 4:44 pm, edited 1 time in total.
Noax
Posts: 851
Joined: Wed Aug 10, 2016 3:25 am

Re: Materialism is logically impossible

Post by Noax »

Hobbes' Choice wrote:I do not justify this post.
This post is irony.
I've already demonstrated my inability to see irony in posts... :oops:
Noax
Posts: 851
Joined: Wed Aug 10, 2016 3:25 am

Re: Materialism is logically impossible

Post by Noax »

attofishpi wrote:ARE YOU ACTUALLY SERIOUS? A pinball machine - or any other machine DOES NOT FEEL PAIN.
Do you just respond to that one snippet or are you following the conversation?

Tell me why the robot doesn't feel pain. All you're doing is asserting it in all caps. That doesn't make the point stick any harder.
I'm looking for a statement of the hard problem, not just an unfounded assertion that there is one.
Terrapin Station
Posts: 4548
Joined: Wed Aug 03, 2016 7:18 pm
Location: NYC Man

Re: Materialism is logically impossible

Post by Terrapin Station »

Immanuel Can wrote:a) I don't know what your "ontology of logic" is. You haven't said. The entire concept sounds dubious. Still, feel free to expound your view.
Re the ontology of logic in general: as long as one doesn't believe that there is no such thing as logic, one will have beliefs (well, assuming one has bothered to think about it) regarding just what logic is as an existent--what sort of existent is it? Where, if anywhere, is it located? (And if it isn't located anywhere, presumably something like a general theory of the ontology of abstract existents would be pertinent.)

My ontology of logic has it as simply a way that individuals think about the world, in terms of the most abstract, generalized relations, where those abstract generalizations are consistent with the individual's experience of the world. So the location of logic in my view is particular brains, and it doesn't exist aside from individuals thinking about the world in that way.

I use the term "rationality" so that it's co-extensive with "logic" by the way.

So that's why you can't do something other than what some individual(s) thinks when you analyze something rationally or logically.
Immanuel Can
Posts: 27609
Joined: Wed Sep 25, 2013 4:42 pm

Re: Materialism is logically impossible

Post by Immanuel Can »

Terrapin Station wrote:My ontology of logic has it as simply a way that individuals think about the world, in terms of the most abstract, generalized relations, where those abstract generalizations are consistent with the individual's experience of the world.
Well, I wonder, how is "logic" different for you from "experience"? Are they identical, as you see it?
So the location of logic in my view is particular brains, and it doesn't exist aside from individuals thinking about the world in that way.
Are mathematical principles the same? Do they exist only in the beliefs of particular brains, would you say? Or is mathematics independent of perception, and if it is, why are logical precepts different?
I use the term "rationality" so that it's co-extensive with "logic" by the way.
Philosophers do tend to separate the two a bit. But you blend them? Was there a reason for that?
So that's why you can't do something other than what some individual(s) thinks when you analyze something rationally or logically.
But if that were true, then would not agreement about anything be either a) impossible, or b) merely contingent, as when two people just "happen to agree" but for no reason?

If there is no such thing as "logic," and no "reasons" that are not individual, then how are things like communication, shared values or rational agreement possible on a non-contingent basis? Is there no possibility of rational exchange, since "rationality" itself is then merely the sum of personal prejudices?

In other words, how do any two people actually solve anything? :shock:
Terrapin Station
Posts: 4548
Joined: Wed Aug 03, 2016 7:18 pm
Location: NYC Man

Re: Materialism is logically impossible

Post by Terrapin Station »

Immanuel Can wrote:Well, I wonder, how is "logic" different for you from "experience"?
No, they're not at all identical. You'd only experience the most abstract, generalized relations in the sense of experiencing your own thought about the most abstract, generalized relations. Most experience is not experience of that.
Are mathematical principles the same? Do they exist only in the beliefs of particular brains, would you say?
Yes. I've stated a number of times that I'm an anti-realist on mathematics. Mathematical "objects" are not mind-independent. They're a way that we think about the world.
Philosophers do tend to separate the two a bit. But you blend them? Was there a reason for that?
The separation is typically based on a realist position about logic.
But if that were true, then would not agreement about anything be either a) impossible, or b) merely contingent, as when two people just "happen to agree" but for no reason?
Obviously I'd not believe that agreement was impossible. Re contingent, "as when two people just 'happen to agree' for no reason": it's not as if persons' brains function completely randomly (including with respect to other persons). One reason for agreement is that people can think similarly.
If there is no such thing as "logic,"
It's not that there's no such thing as logic. There is such a thing as logic. The sort of thing it is is a way we think about the most abstract, generalized relations.
and no "reasons" that are not individual, then how are things like communication, shared values or rational agreement possible on a non-contingent basis?
Again, if you're meaning something like "arbitrary" in a "random" sense by "contingent" for some reason, no one is saying that that's the case. Brains don't work randomly.
Is there no possibility of rational exchange, since "rationality" itself is then merely the sum of personal prejudices?
Rationality being how people logically think isn't incompatible with there being a rational exchange.
uwot
Posts: 6092
Joined: Mon Jul 23, 2012 7:21 am

Re: Materialism is logically impossible

Post by uwot »

Immanuel Can wrote:...how do any two people actually solve anything? :shock:
If you are one of them, Mr Can, they don't.
Immanuel Can
Posts: 27609
Joined: Wed Sep 25, 2013 4:42 pm

Re: Materialism is logically impossible

Post by Immanuel Can »

Thanks for the insight on your view. I appreciate it.