Obvious Leo wrote: UA. This conversation seems to be going nowhere for want of a common understanding of the definition of the word "consciousness". Until you can define what you want this term to mean, even if only for the purpose of addressing your titular question, all I'm seeing is different posters responding to their own definitions of this term.
You now seem to be asking if a machine can be capable of goal-directed behaviour, to which the answer is yes and no. It is capable of goal-directed behaviour but this behaviour is then a function of the programmer of the machine and not a function of the machine itself. In this respect it is nothing like a mind because a mind is a computer without a programme of any description. A mind programmes itself.
This whole topic came up on another forum, and since it is still going on after 8 years, the same problem keeps coming up: how can you determine consciousness without an accurate definition of exactly what it is? Of course you cannot. You have to make the assumption that humans understand what it means to be conscious, and I don't make that assumption. A liberal definition of consciousness would allow for anything that in some way interacts with the environment. Most will not like that, because it would include inorganic matter. So if I start to develop a more restrictive definition, I could come up with any agent in the environment that can act with a self-will... but what does that mean? What's a self? Etc.
But when you say, "In this respect it is nothing like a mind because a mind is a computer without a programme of any description. A mind programmes itself," that also is an assumption. Some do not believe in free will and don't believe any of us act out of choice. From the time we are born we are subject to continuous programming of one sort or another, and our actions reflect that programming. Free will, a mind programming itself, is an illusion. And, without stretching this too far, there are the Buddhist concepts of the illusory nature of the self: the self is a delusional concept. I know you don't like that viewpoint.
So I'll go back to another version of a definition I gave earlier:
"I will define as conscious any being or thing that can act independently of its programming and for the benefit of itself - it must possess the quality of having a self-identity with an ego." So I'll accept a machine as conscious if it can talk to me, has an awareness of what it is, and is even conscious of the fact that I'm feeding it energy to sustain itself; at that level of awareness I could reward the machine by offering it more power (more RAM, etc.) - the machine understands.
"Any sufficiently advanced technology is indistinguishable from magic."
-Arthur C. Clarke
"The limits of the possible can only be defined by going beyond them into the impossible."
-Arthur C. Clarke