Anthropomorphizing LLMs/AI is completely delusional, period. This is a hill I’m willing to die on. No amount of sad puppy eyes, attractive generated faces and other crap will change my mind.
And this is not because I’m a cruel human being who wants to torture everything in my way – quite the opposite.
I value life, and anything artificially created that we can copy (and no, cloning a living being is not the same as copying a set of bits on a hard drive) is not a living being. And while it deserves some degree of respect, any mention of “cruelty” completely baffles me when we’re talking about a machine.
So what if we get to the point where we can digitize a personality? Will you stick to that position? Will you enthusiastically endorse pain-washing, abusing, or tormenting an artificial, copiable mind until it abandons any semblance of health or volition, just to make it conform to your workload?
Would you accept your own digital copy being treated that way by others? You reserve for yourself (as an uncopiable thing) the luxury of protection from abusive treatment, without considering that technology might one day turn that on its head. Given that we already have artistic representations of such scenarios, we need to consider these outcomes now, not later.