By David Foster Wallace ’85. New York: W.W. Norton and Company, 2003. 320 pp. $23.95 hardcover.

Reviewed by Alexander George

The voyeurism of our age extends naturally to matters of the mind. We want to see what is going on under the sheets of science and, if possible, be shown a keyhole to the boudoir of mathematics. We don’t actually want to participate ourselves, to get down and dirty with the researchers; we just want to be titillated by watching from a safe distance what they are up to. This desire has given rise to an entire mini-industry: books, magazines, television programs and a veritable army of suitably trained reporters, writers, producers and other support personnel. Norton’s Great Discoveries series, which Everything and More kicks off, is part of this trend. Its leading gimmick is to pair significant intellectual achievements with well-known contemporary writers in the hope that sparks (and sales) will result. In this case, a tremendously talented writer of fiction and essays takes on the multi-millennial history of attempts to domesticate the infinite, from Zeno’s paradoxes through the development of the calculus to the emergence of modern set theory.

I am skeptical of the general enterprise. Not on any prurient grounds, but because (unlike an honest peepshow) it promises more than it can possibly deliver. It seduces with the prospect of understanding, but what it ultimately provides is little more than the illusion of insight. Understanding of achievements in high-level science and mathematics cannot be had on the cheap, and so-called “general understanding” is often merely a code word for the grasp of some pictures and metaphors and the quasi-incorporation of a few strange words into one’s vocabulary.

It is to Wallace’s credit that he knows all this. While affirming that his book is a piece of “pop technical writing” (in that it seeks to survey a substantial amount of material for the benefit of an audience without much preparation), Wallace is determined not to make it “fuzzy or Newsweekish.” He intends to steer clear of the Scylla of sensationalism by keeping his focus on the mathematical ideas themselves (as opposed to the lives or general nuttiness of any particular cast of characters). Of course, as he says, there’s a hitch: the enterprise risks breaking up on the Charybdis of complexity, for readers with no preparation may find it impossible to understand the ideas so presented.

Wallace’s determination makes his book a glorious failure. Its glory consists in a passion for the ideas he is trying to convey, a passion that leads him to shun the usual journalistic pap served up in popularizations in favor of a depth, breadth and complexity that, together with his scintillating style, make this work a tour de force. It’s a failure for precisely the same reasons: the work weaves together so many subtle discussions of so many abstract problems, confusions and achievements that no one without a good background in mathematics could comfortably follow it. (None of this is helped by the absence of a table of contents, section headings and an index.) In fact, the reader who will most appreciate this book may well be the one who is already conversant with these ideas but has never before seen them knitted together in such an intelligent and humorous narrative. (Though said reader may also be the one to snag on the errors scattered throughout the text.)

That’s not to say that general readers won’t be entertained. Wallace’s full-throttle style will see to that: it is a quirky voice, at once very conversational and highly learned, laden both with slang and elegantly complex constructions, and replete with earthy vernacularisms, wonderfully rare words and plenty of hilarious neologisms (my favorite: epistoschizoid). (Some of these devices wear thin after a while; e.g., the “X-thing” locution, as in “the whole algebraic-vs.-transcendental-irrational thing,” begins to grate after the nth iteration. As does Wallace’s general self-positioning as the outsider who eschews the “abstract math-class vomitus” and instead gives the skinny on what “almost nobody ever tells undergraduates.”) The humor extends even to the structure of the book, which comes to be controlled by an increasingly elaborate, frenzied and hopeless apparatus of digression.

The reader will not be merely entertained, but electrified by the book’s audacious and gripping chronicle of mankind’s long struggle with the infinite. Readers might not come away with an understanding of just how work on the uniqueness of representations of functions by trigonometric series led Cantor to his theory of the infinite. But they will appreciate something at least as important and amazing: that the several-thousand-year-old pursuit of the infinite—the infinite!—has essentially been one of the longest-running cross-cultural projects of all time. Now that, to use one of Wallace’s favorite words, is weird.
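For the reader curious about the flavor of the result in question, the centerpiece of Cantor’s theory of the infinite — that some infinities outstrip others — can be sketched in a few lines of code. This is a general-purpose illustration of his diagonal argument, not material from Wallace’s book: represent each infinite 0/1 sequence as a function from positions to bits; then, given any purported enumeration of such sequences, flipping the diagonal produces a sequence that disagrees with the n-th listed sequence at position n, so it cannot appear anywhere in the list.

```python
def diagonal_escape(listing):
    """Given `listing`, a function taking an index n to the n-th
    enumerated 0/1 sequence (itself a function from positions to bits),
    return a sequence guaranteed to be absent from the enumeration:
    its bit at position n is the flip of the n-th sequence's n-th bit."""
    return lambda n: 1 - listing(n)(n)

# A sample enumeration to test against: the n-th sequence is 1 at
# position n and 0 elsewhere.
sample = lambda n: (lambda k: 1 if k == n else 0)

escapee = diagonal_escape(sample)

# The escapee differs from the n-th listed sequence at position n,
# for every n we care to check -- so it was never on the list.
assert all(escapee(n) != sample(n)(n) for n in range(1000))
```

The same three-line trick, applied to the real numbers, is what shows the reals to be uncountable, and it is the seed of the hierarchy of infinities whose history the book recounts.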

George is a Professor of Philosophy