The Most Human Human

Brian Christian

What would be their purpose?

1.
But what if that is humans’ purpose? That process of definition, the very process of finding a purpose? Vonnegut writes, “Tiger got to hunt, bird got to fly / Man got to sit and wonder, ‘Why, why, why?’ ” This would make the existentialists feel good, the way Aristotle’s decision that contemplation is the highest activity of man made Aristotle feel good, but in this case it would undermine their argument.

2.
“I do not challenge the statement that the most complex creature has tended to increase in elaboration through time, but I fervently deny that this limited little fact can provide an argument for general progress as a defining thrust of life’s history.” The basic argument is that while mean complexity has gone up, modal complexity hasn’t—most of the life on this planet is still, and always will be, bacterial. And because life can’t really get simpler than that, its fundamentally directionless proliferation of variation and diversity is mistaken for progress. In Gould’s analogy, a wildly staggering drunk will always fall off the curb into the street: not because he’s in any way driven toward it, but because any time he staggers the other way, into the buildings, he simply ricochets.
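
Gould’s drunkard’s-walk argument is easy to check by simulation. A minimal sketch in Python (my construction, not Gould’s; every number in it is an illustrative assumption): thousands of lineages stagger at random in “complexity,” with a wall at zero off which they can only ricochet. The mean drifts upward anyway, while the most common value stays pinned at the wall.

import random
from collections import Counter

random.seed(0)
WALKERS, STEPS, WALL = 10_000, 1_000, 0

positions = []
for _ in range(WALKERS):
    x = WALL + 1                     # every lineage starts just off the wall
    for _ in range(STEPS):
        x += random.choice((-1, 1))  # an undirected stagger, left or right
        if x < WALL:
            x = WALL                 # the buildings: ricochet, don't pass through
    positions.append(x)

mean = sum(positions) / len(positions)
mode = Counter(positions).most_common(1)[0][0]
print(f"mean complexity: {mean:.1f}")  # drifts well above zero
print(f"modal complexity: {mode}")     # stays at (or hard against) the wall

No drive toward complexity is coded anywhere in that loop; the upward drift of the mean falls out of the wall alone, which is exactly Gould’s point.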

3.
I live in Seattle, which, in the wintertime, has a near-epidemic prevalence of vitamin D deficiency.

4.
I like imagining Descartes writing in his Meditations how he is doubting the existence of his body—and then putting down his pen and getting up to go pee and eat lunch.

5.
It’s possible that this typo streak is not simply sloppy typing but actually a deliberate attempt to make things tougher for a software sentence parser.

6.
Ludwig Wittgenstein uses the word “game” as an example in Philosophical Investigations of a word that can seemingly never be adequately defined.

7.
Bertrand Russell: “Unless a man has been taught what to do with success after getting it, the achievement of it must inevitably leave him a prey to boredom.”

8.
As the volume Voice Communication Between Humans and Machines, put together by the National Academy of Sciences, admits: “Further research effort is needed in detecting this type of ‘none of the above’ response.”

9.
(Isn’t that a little late? Shouldn’t the programmers have had time to deal with possible rule changes?)

10.
I wonder if part of this is a kind of “notation bias”—I use a website to keep track of the books I read and when, in case I need to go back and reference anything, and it specifies a list of “Read” books and books I’m “Currently Reading.” If instead there were simply one list, called “Books I’ve, at the Very Least, Begun,” my life might be easier.

7. Barging In

Listeners keep up with talkers; they do not wait for the end of a batch of speech and interpret it after a proportional delay, like a critic reviewing a book. And the lag between speaker’s mouth and listener’s mind is remarkably short.

–STEVEN PINKER

Spontaneity; Flow

“Well, I mean, you know, there are different levels of difficulty, right? I mean, one obvious level of difficulty is that, you know, ‘be yourself’ would be an injunction in the first place, right, which suggests, of course, if you have to be told to be yourself, that you could in some way fail to be yourself.” Bernard Reginster, professor of philosophy at Brown University, chuckles. This tickles his philosopher’s sense of humor. “But that’s paradoxical! Because if you’re not going to be yourself, then what else are you going to be? You know? So there’s already something sort of on the face of it peculiar, in the idea that you should be told, or that you could be exhorted, or enjoined, to be yourself—as if you could fail!”

One of the traditional ideas, he says, about what it means to “just be yourself”—the advice and direction that the Loebner Prize organizers give the confederates each year—is to be your true self, that is, “to figure out what your quote-unquote true self is supposed to be, and then [to become it] by peeling away all the layers of socialization, so to speak, and then trying to live your life in a way that would be true to that true self, so to speak.” That philosopher’s tic of putting everything in quotation marks—because to use a word is, in a way, to endorse it—tips Reginster’s hand, and paves the way for the counterargument long before it comes. “Now, the big problem with that idea,” he says, “is that a great deal of fairly recent developmental psychology and a great deal of research in psychiatry and psychoanalysis and so forth has suggested, at least, that the idea that there would be a true ‘you’ that comes into the world unaffected, unadulterated by the influence of the social environment in which you develop, is a myth. That in fact you are, as it were, socialized from the get-go. So that if you were to peel away the layers of socialization, it’s not as if what would be left over would be the true you. What would be left over would be nothing.”

Reginster echoes here Turing’s words in response to the “Lovelace Objection” that computers are incapable of “originality”: How sure are we that we can? They echo, also, Turing’s less confident and slightly more uneasy rhetorical question in that same 1950 paper:

The “skin of an onion” analogy is also helpful. In considering the functions of the mind or the brain we find certain operations which we can explain in purely mechanical terms. This we say does not correspond to the real mind: it is a sort of skin which we must strip off if we are to find the real mind. But then in what remains we find a further skin to be stripped off, and so on. Proceeding in this way do we ever come to the “real” mind, or do we eventually come to the skin which has nothing in it?

Without this notion of an inner-sanctum core of self, can any sense be made of the “just be yourself” advice? Reginster thinks so. “The injunction to be yourself is essentially an injunction no longer to care or worry about what other people think, what other people expect of you, and so on and so forth, and is essentially a matter of becoming sort of unreflective or unself-conscious or spontaneous in the way in which you go about things.”

It’s interesting that the human ability to be self-aware, self-conscious, to think about one’s own actions, and indeed about one’s own thoughts, seems to be a part of our sense of unique “intelligence,” yet so many of life’s most—you name it: productive, fun, engaging, competent—moments come when we abandon such hall-of-mirrors frivolities and just, à la Nike, do things. I am thinking here of sex, of athletics, of the performing arts, of what we call the “zone” and what psychologists call “flow”—the state of complete immersion in an activity. When we are acting, you might very well say, “like an animal”—or even “like a machine.”

Indeed, “The ego falls away,” writes Hungarian psychologist Mihaly Csikszentmihalyi, popularizer of the psychological notion of “flow.” According to Csikszentmihalyi, there are several conditions that must be met for flow to happen. One of these, he says, is “immediate feedback.”

Long Distance

At Christmas this past year, my aunt’s cell phone rings and it’s my uncle, calling from Iraq. He’s in the Marine reserves, on his second tour of duty. As the phone makes the rounds of the family members, I keep thinking how incredible and amazing technology is—he is calling us, live, from a war, to wish us Merry Christmas—how technology changes the dynamics of soldier-family intimacy! In the days of letter writing, communication was batch-like, with awkward waits; now we are put directly in contact and that awkward waiting and turn-taking is gone and we can really talk.

The phone comes to me and I exclaim, “Hi! Merry Christmas!”

Silence.

It jars me, my enthusiasm met with seemingly no reaction, and I become self-conscious—am I perhaps not so high on his list of family members he’s excited to talk to? Then, a beat later, he finally comes out with his own, albeit slightly less effusive “Merry Christmas!” Thrown off, I fumble, “It’s great to be able to talk to you when you’re all the way over there.”

Again silence. No response. Suddenly nervous and uncomfortable, I think, “Didn’t we have more rapport than this?” Everything I want to say or ask suddenly feels trivial, inconsequential, labored. Like a comedian left hanging without a laugh at the end of a joke—it takes mere tenths of a second—I feel that I’m floundering, that I’m wasting his time. I’m wasting his time during a war. I need to hand the phone off pronto. So when he finally replies, “Yeah, it’s great to be able to talk to you when you’re all the way over there,” I mumble a “Well, I won’t hold you up—oh, here’s so-and-so! Talk to you soon!” and awkwardly hand it away.

Answering Porously

A few months later I’m doing a phone interview for a group of booksellers in some of this book’s very early-stage PR. The questions are straightforward enough, and I’m not having any trouble coming up with the answers, but what I find myself struggling with is the length of the answers: with something as complex as a book, everything has a very short, sound-bite answer, a short, anecdotal answer, a long, considered answer, and a very long, comprehensive answer. I have these conversations all the time, and for the most part I have two main ways of making the answers “site-specific.” One is to watch the listener’s face for signs of interest or disinterest and adjust accordingly; the other is to make the answer porous, to leave tiny pauses, where the listener can either jump in, or redirect, or just let me keep going. With my barista, I begin with the sound-bite answer and happily get eschatological with her as she jumps in and tells me with a half smirk that the “machines” can “bring it” and that she’s “totally prepared to eat [her] cats” in any kind of siege scenario. With some of my more academic-leaning acquaintances, I watch them looking quizzical and concentrated and not much inclined to interject anything until I reel out the full story, with all its nuances and qualifiers in place.

On the phone with the booksellers I of course can’t see their faces; in fact I don’t even know how many people “they” are on the other end. When I proffer those “quarter-note rests” to prompt either the expectant “huh’s” and “yeah’s” that spur a tale on, or the contented ones that wrap it up, I hear nothing. If I stretch it to a “half-note rest,” they assume I’m done and ask me a new question. I try splitting the difference; then we both jump back in at the same time. A guy can’t catch a break—or, more accurately, maybe he can’t get someone else to catch his breaks. Somehow the timing ballet that feels like second nature in person seems consistently—here, and as a general rule—to break down over the phone. I do the best I can, but it feels, somehow, solitary.

Computability Theory vs. Complexity Theory

The first branch of computer science theory was what’s come to be known as “computability theory,” a field that concerns itself with theoretical models of computing machines and the theoretical limits of their power. It’s this branch of theory in which Turing made some of his greatest contributions: in the 1930s and ’40s, physical computing machines were so fledgling that it made sense to think idealistically about them and the purely theoretical extents and limits of their potential.

Ignoring the gap between theory and practice has its drawbacks, of course. As Dave Ackley writes, “Computability theory doesn’t care a whit how long a computation would take, only whether it’s possible or not … Take a millisecond or take a millennium, it’s all the same to computability theory.”

Computer scientists refer to certain problems as “intractable”—meaning the correct answer can be computed, but not quickly enough to be of use. Intractable problems blur the line between what computers “can” and “cannot” do. For instance, a magic, oracular machine that can predict the future—yet works slower than real time—is a machine which, quite literally, cannot predict the future.1

As it turns out, however, intractability has its uses. Combination locks, for instance, are not impossible to open: you can just try every combination until you hit upon the correct one. Rather, they’re intractable, because the time it would take you to do that would get you caught and/or simply not be worth whatever was behind the lock. Similarly, computer data encryption hinges on the fact that prime numbers can be multiplied into large composite numbers faster than composite numbers can be factored back into their primes. The two operations are both perfectly computable, but the second happens to be exponentially slower—making it intractable. This is what makes online security, and online commerce, possible.
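
That asymmetry can be felt even at toy scale. A minimal sketch in Python (the primes below are small stand-ins I chose for illustration; real encryption keys run to hundreds of digits, where the gap becomes astronomical):

import time

p, q = 999_983, 1_000_003        # two known primes, tiny by cryptographic standards

t0 = time.perf_counter()
n = p * q                        # the easy direction: a single multiplication
t_multiply = time.perf_counter() - t0

def factor(n):
    """Recover the prime factors by trial division: the hard direction."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1                  # n itself was prime

t0 = time.perf_counter()
p2, q2 = factor(n)
t_factor = time.perf_counter() - t0

print(f"multiplied in {t_multiply:.2e} s, factored in {t_factor:.2e} s")

Multiplying is effectively instantaneous; undoing it takes visibly longer, and every couple of digits added to the primes multiplies the trial-division work another tenfold while barely touching the multiplication. That lopsidedness is the lock.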

The next generation of computer theorists after Turing, in the 1960s and ’70s, began to develop a branch of the discipline, called complexity theory, that took such time-and-space constraints into account. As computer theorist Hava Siegelmann of the University of Massachusetts explains, this more “modern” theory deals not only “with the ultimate power of a machine, but also with its expressive power under constraints on resources, such as time and space.”

Michael Sipser’s textbook Introduction to the Theory of Computation, considered one of the bibles of theoretical computer science, and the textbook I myself used in college, cautions, “Even when a problem is decidable and thus computationally solvable in principle, it may not be solvable in practice if the solution requires an inordinate amount of time or memory.” Still, this is the introduction to the book’s final section, which my senior-year theory course only touched on briefly in the semester’s final weeks.
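
Sipser’s caution is easy to make concrete. A minimal sketch in Python of a problem that is decidable at every size yet intractable past a certain one (subset sum is my choice of illustration, not the textbook’s): the procedure below always terminates with the correct answer, but it inspects all 2**n subsets, so each additional number doubles the work.

from itertools import combinations

def subset_sum(numbers, target):
    """Decide whether any subset of `numbers` sums to `target`,
    by brute force over all 2**len(numbers) subsets."""
    for r in range(len(numbers) + 1):
        for combo in combinations(numbers, r):
            if sum(combo) == target:
                return True
    return False

print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # True: 4 + 5

With twenty numbers, that is about a million subsets—a blink. With sixty, it is about 10**18, or centuries on the same machine: solvable in principle, not in practice.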
