People who do religion don’t need to study it, because they already know what it is. People who have never embraced a religion as an adult feel they can study religion objectively and still really understand it. Religious folks feel they can fairly evaluate the “truth” of other religions by reading books about them. People who have butterflied around several faiths over the years, but never made a long home in one, feel they understand what drives the religious heart. Grand generalizations about religion spew easily from many of these people. But are the generalizations intelligent?
What can pass as “intelligent” is often overrated. Since my youth, I’ve watched ordinary conversations and seen superficiality constantly disguising itself as depth. One summer, in my late teens, I read the book “EgoSpeak,” which confirmed my worst skepticism. Perhaps God can rescue us from our meaningless chatter, I thought, as I dove into Christianity at that time. You see, my god, my religion and much more were colored by my temperament and my circumstances. But it didn’t take long, as I watched my own conversations with God, before I saw the same EgoSpeak creature doing the same things.
Intelligence that can step back and understand circumstance and context is hard to cultivate. Most of our conversations ride a locomotive of habit — a juggernaut of trite conversation grabbers, full of tricks to pretend we are listening to the other person.
The New Yorker has a fun, short article reviewing a paper that criticizes the type of artificial intelligence which aims at mimicking human conversation (consider “Siri”). I agree with its conclusion that the tricks, diversions and ploys meant to make a machine sound like an actual human are failing, because we are only building machines that avoid a whole class of conversations — insightful ones. But I worry that the criticism can go deeper. Is normal human speech (“EgoSpeak”) really something worth emulating — even when it is “insightful” or intelligent?
NPR and the New York Times recently educated me on something called “social bots” — AI that fools readers into thinking they are interacting with a real person. Apparently, these bots flood online dating services and trick people constantly. One entrepreneur bought such a service, went into the data, found the bots and erased them. The surprising result of cleaning up the service: membership dropped! People loved being told how cute they are, how interesting they are, how much someone adores them — even if those admirers keep disappearing or seeming strange after more in-depth conversation. The bots keep real people coming back for more superficiality.
This month a big stress dropped from my life when I again passed my medical boards (required every 6 years). So, as you may have noticed, I have more time to write now. Taking advantage of this time, I am also taking some computer courses with my son at Coursera.org. And as I dive back into programming, and after seeing a program on YouTube which uses neural networks and genetic programming to try to mimic real human soccer playing, I daydream of writing modeling software (a delusional ambition) which mimics human religious behavior. But would such a project be as disillusioning as reading the book EgoSpeak, as learning about the deceptive power of social bots, and as watching people theorize about religion? I think so.
Generating grand theories based on our own limited experience seems similar to EgoSpeak, where we grab conversations back into our own court. We rarely deeply encounter another person, and yet we feel we encounter people all day long. We can’t see ourselves clearly. Is this human stupidity worth emulating on computers? What would a higher intelligence look like — perhaps one day machines will show us. Humans have feigned intelligence for a million years and created imitators to fool each other: both gods and bots. I’d say we should shoot higher than imitating humans.