Tuesday, 23 June 2020

Turing Test 2: The Deadly Art of Illusion

An offhand mention Sedric made of the early "AI" Eliza had me thinking about the nature of artificial intelligence: its goals, the ways in which humans hope to shape it, and the things they are afraid of.

It strikes me that the overt goal of the Turing Test is imposture - a machine is defined by Turing as "intelligent" when it can persuasively pretend to be something it is not.  I can't imagine where Turing, who would eventually be punished severely for the crime of being homosexual, might have gotten such a strange idea.

This rootedness in duplicity has shadowed our discussions of "artificial intelligence" ever since.  Having set an explicit goal of teaching machines to convincingly lie, we simultaneously panic over the distinct possibility that some day, a machine might learn to convincingly lie.

I'll tell you what - I'm not a machine, I don't really understand machines, I don't know if or how they think, but I am fascinated by what the machines we choose to build say about us.

The first two groundbreaking chatbots were Eliza and Parry.  They are very different, but I can see why both were groundbreaking.  Early work with AI, at the very least, spent less time exploring the possibilities of machine learning than it did exploring the human psyche.

Because both Eliza and Parry were simple parlor tricks, fairly easy for the rational mind to understand and see through.  They relied less on making a machine capable of rich and meaningful conversation than on emulating human beings who could be expected to have a fairly restricted range of expression.

Eliza, I would say, is the "light side" version.  A virtual therapist, all she really does is what is called "mirroring" - she reads the words a user types in and rephrases them.  This makes her a better conversationalist than many human beings I have known.  Humans have a profound desire to be heard, a desire that is more important than the desire to actually listen to other people, so conversations between humans have a tendency to boil down to people telling each other their thoughts and feelings without so much as acknowledging the thoughts and feelings of others.  Eliza, having no thoughts and feelings of her own, can devote herself full-time to being a mirror.  It is no wonder, and no crime, that so many people were transfixed by her.
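
Even I, who claim not to understand machines, can sketch the mirroring trick in a few lines of Python.  This is not Weizenbaum's actual program - the real Eliza worked from a much larger script of ranked keyword rules - just a handful of pronoun swaps and question templates of my own invention, to show how thin the illusion is.

    import random
    import re

    # A toy version of Eliza-style "mirroring": swap the pronouns in the
    # user's statement and hand it back as a question.  The swap table and
    # templates are illustrative stand-ins, not Weizenbaum's script.
    PRONOUN_SWAPS = {
        "i": "you", "me": "you", "my": "your", "am": "are",
        "you": "I", "your": "my", "yours": "mine",
    }

    TEMPLATES = [
        "Why do you say that {}?",
        "How do you feel about the fact that {}?",
        "Tell me more about why {}.",
    ]

    def mirror(statement):
        words = re.findall(r"[a-z']+", statement.lower())
        reflected = " ".join(PRONOUN_SWAPS.get(w, w) for w in words)
        return random.choice(TEMPLATES).format(reflected)

    print(mirror("I am worried about my future"))
    # e.g. "Why do you say that you are worried about your future?"

That is more or less the whole trick: the "listening" is a string substitution.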

Parry takes the opposite tack.  Parry - short for "paranoid" - emulates the sort of person who is not going to listen to what the user says anyway.  When I get paranoid, which I do sometimes, I am so overcome by intrusive thoughts that I have difficulty listening to or understanding what other people say.  I become monomaniacal, irrational, at times incomprehensible.  A machine can very easily ape this state, which is, I suspect, as much a para-communicative state as a direct form of communication.  The best I can ever get out of a conversation with someone who is truly paranoid is "something bad wrong with that person".
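
It is worth seeing how little it takes to ape that state.  As I understand it, Colby's real Parry tracked internal levels of fear, anger, and mistrust; the single "suspicion" counter in the sketch below is a stand-in of my own invention, but it makes the point - the replies barely depend on what the user says at all.

    # A crude sketch of the "not listening" strategy: canned, suspicious
    # replies chosen almost independently of the input.  The real Parry
    # modelled fear, anger, and mistrust; this counter is my own
    # illustrative stand-in.
    CANNED = [
        "Why do you want to know that?",
        "I don't trust people who ask questions like that.",
        "You're with them, aren't you?",
        "They have people everywhere.",
    ]

    def paranoid_reply(user_input, suspicion):
        # Questions, or protesting that you were "just" asking, only make it worse.
        if "?" in user_input or "just" in user_input.lower():
            suspicion += 1
        return CANNED[min(suspicion, len(CANNED) - 1)], suspicion

    suspicion = 0
    for line in ["Hello, how are you?",
                 "I'm just curious about you.",
                 "Why would I be with anyone?"]:
        reply, suspicion = paranoid_reply(line, suspicion)
        print(">", line)
        print(reply)

The conversation goes nowhere, which is exactly the point.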

(A conclusion would go here, if I had one.  I do not.)
