Eliza: what makes you think I'm a psychotherapeutic chatbot?

In a three-part history of the Eliza psychotherapeutic chatbot, Jimmy Maher ('the digital antiquarian') covers both the circumstances of Eliza's birth and the way the people around her thought about her. The third installment, in particular, deals with the reactions of people who watched people reacting to Eliza, that is, what Eliza's creator made of the way her users behaved:

Weizenbaum's reaction to all of this has become almost as famous as the Eliza program itself. When he saw people like his secretary engaging in lengthy heart-to-hearts with Eliza, it… well, it freaked him the hell out. The phenomenon Weizenbaum was observing was later dubbed "the Eliza effect" by Sherry Turkle, which she defined as the tendency "to project our feelings onto objects and to treat things as though they were people." In computer science and new media circles, the Eliza effect has become shorthand for a user's tendency to assume based on its surface properties that a program is much more sophisticated, much more intelligent, than it really is. Weizenbaum came to see this as not just personally disturbing but as dangerous to the very social fabric, an influence that threatened the ties that bind us together and, indeed, potentially threatened our very humanity. Weizenbaum's view, in stark contrast to those of people like Marvin Minsky and John McCarthy at MIT's own Artificial Intelligence Laboratory, was that human intelligence, with its affective, intuitive qualities, could never be duplicated by the machinery of computing — and that we tried to do so at our peril. Ten years on from Eliza, he laid out his ideas in his magnum opus, Computer Power and Human Reason, a strong push-back against the digital utopianism that dominated in many computing circles at the time.
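Part of what makes the Eliza effect striking is how little machinery sits behind the illusion. Eliza's DOCTOR script worked by keyword matching and pronoun reflection; the sketch below, with made-up rules rather than Weizenbaum's actual script, shows the kind of surface-level rewriting involved:

```python
import re

# A few ELIZA-style rules: a pattern to match and a response template.
# These rules are illustrative, not from Weizenbaum's original script.
RULES = [
    (re.compile(r"\bI am (.*)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"\bI feel (.*)", re.I), "Tell me more about feeling {0}."),
    (re.compile(r"\bmy (.*)", re.I), "Why does your {0} concern you?"),
]

# Pronoun reflection, so the echoed fragment reads from the
# program's point of view ("my job" becomes "your job").
REFLECT = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I"}

def reflect(fragment):
    return " ".join(REFLECT.get(word.lower(), word) for word in fragment.split())

def respond(sentence):
    for pattern, template in RULES:
        match = pattern.search(sentence)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please go on."  # stock reply when no rule matches

print(respond("I am unhappy about my job"))
# → Why do you say you are unhappy about your job?
```

There is no model of the conversation at all, just string rewriting; the appearance of an attentive listener is supplied entirely by the user, which is exactly the projection Turkle's definition describes.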

Eliza, Part 1

Eliza, Part 2

Eliza, Part 3

(via O'Reilly Radar)

(Image: NLP Addiction)