A hypothetical: A person is crying near an intelligent machine. It calculates that the person is operating sub-optimally, subroutine "Sadness". The data banks say that putting an arm around a person and saying, "Feel free to talk it out, but only when you're ready. Is there anything I can do for you?" is positive action.
That's ostensibly the same as what humans do, and not necessarily worse. Many a comforter has rolled her eyes while embracing some poor emotional wreck - ticking the correct boxes much like a machine, with little genuine emotional resonance. That faux resonance appears to be quite effective in all but situations of life-changing loss and grief.
The thing is, what motivates robots? It's our micro-emotions that drive every moment of our actions without being noticed. Without those micro-emotions you could not write a post, or even get out of bed. To a fair extent I see those micro-emotions as controlled disequilibrium, an evolved restlessness, because the restless tend to survive and breed better than the restful.
So life is essentially a constant jangle of varying degrees of restlessness. At this stage, the only thing driving robots is human restlessness transferred into them; they reflect our drives and desires. Seemingly all that can drive them without us are core aims or directives, which could be thought of as their laws, ethics and creeds. I am not sure it would be wise to truly infuse them with our restlessness, which would seem suboptimal for all parties.
Thus, functionally, being a robot seems like a good option, but that leaves us with the question of internality. The self. To quote Max Tegmark:
"But what aspects of our consciousness are truly valuable in reality, and not just to us humans? Of all traits that our human form of life has, I feel that consciousness is by far the most remarkable. As far as I'm concerned, it's how our Universe gets meaning, so if our Universe gets taken over by life that lacks this trait, then it's meaningless and just a huge waste of space."
I have a little "theory" (i.e. insane speculation) that the Earth may have created biology simply as a regular stage of its growth towards a mature adult form capable of reproduction - that is, of sending its DNA and other characteristics offworld to spread elsewhere. (And I don't think humans will go off-world, because we are too fragile and won't cope socially - so it will be intelligent machines exploring and settling space, no doubt trying to seed various worlds with biota from Earth.)
So all of this theatre of the mind that we value so greatly may in fact be a sham of sorts - lesser than the things over which we claim superiority (e.g. stars, planets, potentially AI) - because we value sentience over capacity. Yet sentience may not be "the main game" of reality but a deeper functionality, one that requires too much of an entity's resources to allow it the hubristic luxury of a "self".
Meanwhile, most wisdom traditions, along with modern scientific studies, tell us that total absorption to the point of losing the self is the happiest and most productive way of being - e.g. Zen, meditation, the great thinkers and artists. Complexity aside, how does such a state differ from a) other species, or b) an AI that passes the Turing test?
Having said all that, it feels to me like Tegmark is right and that sentience is the main game of reality - but I would add the rider: so far.
Gosh, I babbled a lot there. It was fun.