Friday, December 28, 2012

R.I.P. 2012


Among the authors we lost in 2012: Czech novelist Josef Skvorecky.  American science fiction writer Ray Bradbury.  Poet and painter Dorothea Tanning.  Novelist, playwright and public intellectual Gore Vidal. Polish poet and 1996 Nobel Prize for Literature winner Wislawa Szymborska. Historical novelist Barry Unsworth.  Cultural writer Jacques Barzun.  Film critic Andrew Sarris.

Other writers not pictured: novelist Carlos Fuentes, Nora Ephron, ecologist Barry Commoner, Ernest Callenbach, Paul Fussell, art critic Robert Hughes (The Shock of the New), Maurice Sendak, poet Adrienne Rich, poet Daryl Hine, science fiction writers Harry Harrison and K.D. Wentworth, novelist Harry Crews, poet Irene McKinney, psychologist Daniel Stern, music critic Charles Rosen, poet Jack Gilbert, children's book author Jean Merrill, James Q. Wilson, novelist Rosa Guy, journalist Alexander Cockburn, Russell Means, art critic Hilton Kramer, British novelist James Riordan, spy novelist Dorothy Gilman, poet Reed Whittemore, Earl Shorris, historian Peter Connolly, children's book author Jan Berenstain, William Hamilton, Stephen Covey, Jeffrey Zaslow.  Barney Rosset, founder of Grove Press.  May they rest in peace.  Their work lives on.

Among these are authors I've read to my great profit.  But I need to recognize a few books in particular that were important to me.  One of them is Robert Hughes's The Shock of the New, which, along with his PBS series, opened my eyes and mind to modern art.  I heard him speak once as well, and he was riveting.


Another book that opened my mind was Paul Fussell's Class.  Even given his generally droll treatment, the idea that America had classes at all was heretical--and, I blush to admit, new to me.

I began reading Ray Bradbury at a fairly young age--pretty directly after the science fiction of the Winston series meant for young readers.  It was later on that I came to appreciate Fahrenheit 451, and then its brilliant screen version by Truffaut.  It gains in importance over time.


Monday, December 17, 2012

Gifts of Art



Rene Magritte: Newly Discovered Works
Edited by Sarah Whitfield
Yale Press

This is the season for coffee table art books, which have apparently outlived coffee tables.  Rene Magritte is one of a handful of modern artists who also fascinates a popular audience with his playful juxtapositions and enigmatic images.  Over the past couple of decades a number of works have surfaced purporting to be Magritte's.  This volume publishes color images of many that have been authenticated as true Magrittes.

For the Magritte aficionado, the imagery of many will be familiar--not only did Magritte often mix and match his imagery, but he created exactly (or almost exactly) the same imagery in different media (here it seems to be most often gouache versions of oil paintings.)  Still, there are enough new variations and several spectacular new images to delight even the casual admirer of Magritte's work.  This is a handsome volume indeed, with some images filling a large page.  The scholarly notes are precise but don't overwhelm the images.  Like many famous Magritte paintings, these delight--they make you smile, often with that air of mystery and melancholy.  I've always liked his skies, which seem to be between night and day.  Dawn or dusk--or both.  That's Magritte.

Here are three more of my favorites from Yale:


Ezra Stoller: Photographer
Edited by Nina Rappaport and Erica Stoller

Ezra Stoller's photographs of the work of modern architects helped make them famous.  This volume of his mostly black-and-white photography from the 1940s to the 1970s includes these photos, but also photos of other characteristically modern exteriors and interiors--everything from inside the United Nations to inside a Sears store.  The photos often concentrate on what's most interesting about a building: the play of light and shadow, for instance, or its particular place in the landscape (a small modern house at the top of a formidable rocky hill.)  A few of the interiors include people, which at this point help to place them in time, but most of the images are recognizably from the modern age--the pre-post-modern age, that is.

The selection and arrangement of images is excellent for browsing as well as study, and there are several introductory essays on the biographical, historical and formal background.  "My father was a storyteller," Erica Stoller writes, and this volume is like a collection of visual short stories.  It should please readers beyond those involved in either architecture or photography, but will be of special interest to them.

 Ancestral Modern
by Pamela McClusky, Wally Caruana, Lisa Graziose Corrin, and Stephen Gilchrist
 
Artists and others have long been fascinated with the imagery of Australian aboriginal art, both ancient and contemporary.  But beyond their interest as abstract images, there is often a specific relationship of the images to actual landscape, animals and historical events.  This volume matches images with the aspects of the real world that inspired them.  It should appeal to the wandering eye, open to color and pattern and to their relationship to nature, but there is also the scholarly support by two Australian and two American curators that adds another level of satisfaction as well as a permanent contribution to knowledge.


Roy Lichtenstein
James Rondeau and Sheena Wagstaff; With contributions by Clare Bell, Yve-Alain Bois, Iria Candela, Harry Cooper, Sara Doris, Chrissie Iles, James Lawrence, and Stephen Little.

Lichtenstein's images also combine popular fascination with artistic stature.  This volume expands on the familiar with a more complete view of the artist's work, plus previously obscure drawings and collages.  The contributors place Lichtenstein in art history and his own time, while the wealth of images (some 130 works) presented in this large format may fascinate and delight on their own.  

Wednesday, December 12, 2012


The Rise of Nuclear Fear
By Spencer R. Weart
Harvard University Press

For readers born since 1990, nuclear fear may be a concept hard to access. Thermonuclear war is something out of cheesy old movies or video games, and nuclear power is a running joke on The Simpsons. Even older generations that saw those movies without irony, back when nuclear apocalypse was possibly just around the corner on any given day, don't think about it much anymore, even though the U.S. and Russia still have enough hydrogen bombs pointed at each other to set civilization back for decades.

But as Spencer Weart writes, “Sometimes the most powerful things in our heads are the ones we don’t pay attention to.” It takes just a few new events to resurrect fearful images: the Fukushima reactor disaster in Japan, the specter of a mushroom cloud over an American city from a terrorist bomb from Iraq or Iran. Those nightmarish images were created over many decades, and Weart produces a history of that imagery.

At first it was extremely positive: the discovery of radium and early theories of atomic potential led some scientists and journalists to proclaim unlimited power for almost no cost, leading to the rich and gleaming White City of the future. But even in the early 20th century the atom’s equally immense potential for destructiveness was the subject of warnings and science fiction tales of total devastation: the Empty City.

  Then in 1945 came the reality. “With the news from Hiroshima sensitive thinkers quickly realized that doomsday was no longer just a religious or science-fiction myth, but as real a part of the possible future as tomorrow’s breakfast.” Weart chronicles the oscillating hopes for utopia and fears of oblivion, and the highly instructive repressions, denial and numbing during the most obviously threatened decades from the 50s through the 80s. He does so in even greater detail (and with better sourcing) in his earlier book (titled simply Nuclear Fear) but in exploring what he calls the post-1990 Second Nuclear Age  in this book, he demonstrates how this imagery echoes—for example, in the pictures of the Twin Towers falling and the devastation around Ground Zero, with their powerful reminders of the mushroom cloud and the Empty City of Hiroshima.

Weart also links nuclear imagery with more ancient and archetypal images, particularly of alchemy (as did some early atomic scientists.) The failure of the climate crisis to inspire motivating imagery, he believes, is because it has not yet found that mythological depth.

While Weart suggests that nuclear fear may have helped prevent actual nuclear war, he sees more virtue in disengaging the imagery from the realities, especially when it comes to nuclear power versus fossil fuel pollution--particularly from coal--which has already killed and sickened millions.

As Weart shows by his approach and this book's content, the response to atomic power involves facts and feelings, in many different combinations.  (Sometimes when the public felt they were being lied to, they were right.) Weart is frequently perceptive in both the more metaphorical areas and in the relationship of imagery to fact in both of the nuclear arenas--as well as in their relationship to each other. His writing voice is clear, with enough personality to carry reader interest and confidence.  Because he explores the topic to the depth required for understanding it--and our times--this is an important book.  Together with Nuclear Fear, and a few other books like DeGroot's The Bomb and Lindqvist's A History of Bombing, it is indispensable to that understanding.

Wednesday, December 05, 2012


2312: A Novel
by Kim Stanley Robinson
Orbit

It’s 300 years in the future and planet Earth is predictably hot—“almost an ice-free planet,” with swaths of the northern hemisphere as hot as the equator now, and the oceans 11 meters higher. Florida is underwater and Manhattan is flooded but inhabited: the 24th century Venice. An attempted technological solution (to block sunlight) only led to faster catastrophe.

But humanity has also spread out to the farthest reaches of the solar system, establishing radically new civilizations on planets and moons, and inside hollowed-out asteroids. While Earth is ruled by “powerful nation-states that were also corporate conglomerates,” the outside worlds (great and small) are teeming with experiment and variety, fiercely independent and ingenious but beginning to cooperate in a new kind of economy. Humans are physically changing in response to their new habitations’ gravity and distance from the sun, as well as some genetic manipulation.

  Kim Stanley Robinson fans will recognize features of his other space-based tales (longevity treatments, other worlds as utopian experiments, artificial intelligence and its complications, even his previous vision of Mercury and to some extent of Mars) and science fiction fans will note echoes from Arthur C. Clarke (his “space elevator” technology) and George Zebrowski’s ground-breaking 1979 Macrolife (colonies inside asteroids.) But with a few new technological wrinkles he creates a plausible and expansive future that includes struggle and adventure.

  Sure, his vision of new worlds of great variety and culture, replicating past Terran civilizations and tenderly nurturing animal species extinct on Earth, may tend towards the idealistic or at least hopeful. It seems just as possible that they would replicate suburbia. But ripples of plausible hope are what Kim Stanley Robinson brings to all the planets on the table.

Part of the plausibility is the story, full of conflict and conspiracy, centered on a mystery. Part of the pleasure of reading it is the central if unlikely love story, and Robinson’s unique feel for the natural world (or worlds) and of humans in the landscape. There is also a central revolutionary act to revive Earth that I won’t give away, but it’s unique in sci-fi, and uniquely KSR.

Robinson’s singular championing of the primal and the natural world does not prevent a hope for grand technologies that “could at last begin to overturn Jevons Paradox, which states that the better human technology gets, the more harm we do with it.” This is a long (561 pages) and richly imagined book-- not nearly as long as his Mars trilogy, but with some of the same qualities. With a literary background (including Theory), Robinson at the beginning of his career made a deliberate choice to bring literary intentions and quality to the genre of science fiction. This is yet another proof of that experiment’s ongoing success.

Friday, November 30, 2012

Net Smarties


Net Smart: How to Thrive Online
By Howard Rheingold
The MIT Press

Howard Rheingold has a white moustache. He dresses like the prototype of an aging hippie, but he’s been involved in personal computer theory and practice for 30 years. His social network experience goes back to one of the earliest examples, San Francisco’s legendary The Well. Younger readers may know him as the author of Smart Mobs: The Next Social Revolution. So he’s positioned to speak to several generations on “how to thrive online,” including boomers.

  But this is not a simple how-to book, full of bullet points and illustrations (though it does include some cartoons reminiscent of R. Crumb.) Rheingold writes about how to use the net “mindfully,” which includes critically. Two of his five web literacies (along with attention, participation and collaboration) are net smarts and crap detection. Still, his stance is mainly positive on the benefits and the social change inherent in what he calls the shift from group to network.

  His approach is historical as well as theoretical, philosophical and practical. He pays attention to how adventures in cyberspace change individuals as well as society, and he considers several sides of those questions. He can be acerbic as well as inspirational.

  One of his main assertions is that “the emerging digital divide is between those who know how to use social media for individual advantage and collective action, and those who do not.” Originally the “digital divide” meant the distance between the individuals and communities who could financially afford to fully participate in cyberspace and those who couldn’t. That divide still exists, especially as full participation seems to require frequent new purchases of the latest devices.

The author’s personal experience—and that of his students and children—provides rare perspective on cyberspace. This is also a kind of guidebook to future practice, as Rheingold emphasizes the ethics of good networking and social collaboration. So part of his advice, in addition to how to manage information overload and survive on Facebook, is how to be a good participator and network citizen. Plus his emphasis on the skills of attention and critical thinking is useful beyond cyberspace.

    The book is pretty clearly written but it’s not a fast read because there is at least one interesting idea on every page, and depending on individual net experience, a number of challenging topics. It’s also meant to be read beginning to end, which is not exactly the Internet way. It’s about 250 pages of text plus lots of notes and a mediocre index, but what’s missing is a sorely needed glossary of terms as they are used in the book. Overall it’s that paradox of the instant-obsolescence computer age: a keeper.
                                                                    *

There are a few other books about these technologies and the real world for the computer literate on your shopping list this year.  Computing: A Concise History by Paul E. Ceruzzi (MIT) is exactly that: a clearly written history of computing, complete with names and places--and this one does have a glossary.  At 150-plus pages it's very handy, but still a solid and serious work.

Future Perfect: The Case For Progress In A Networked Age by Steven Johnson (Riverhead Books) is breezily written, with a point of view that's basically similar to Rheingold's, but of course it's a different book, with different perspectives, examples and elaborations. 

The Technology of Nonviolence: Social Media and Violence Prevention by Joseph G. Bock (MIT Press) is even more specific.  It is cutting-edge and global, with chapters on Countering Ethnoreligious Violence in Sri Lanka, Interrupting Gang Violence in Chicago and Crowdsourcing during Post-election Violence in Kenya.  It begins with theoretical frameworks and ends with evaluations and recommendations.  This is a thoughtful, sometimes provocative but well-presented and serious work, especially for those grappling with such issues in government, NGOs and even the corporate world, and for students interested in these fields. 
 

Friday, November 23, 2012

Where the Heart Beats: John Cage, Zen Buddhism and the Inner Life of Artists
by Kay Larson
Penguin Press

Kay Larson makes a convincing case that, after gathering particular artistic and philosophical strands together from others, John Cage added his own insights and innovations to become a dominant influence in 20th century arts. Those insights were primarily from his understanding of Zen Buddhism, and it is Larson’s singular contribution that as a Zen practitioner, she can elucidate them.

Larson was a distinguished New York art critic before she began Zen studies with John Daido Loori (whose book Cave of Tigers I particularly admire.) This unique view of John Cage is organized by portraits of other artists—those who influenced Cage and those he influenced—as well as Cage’s own biography. As the subtitle suggests, the emphasis is on their inner lives.

Among Cage’s early influences was the artist Morris Graves. They met in 1938 after a Cage percussion concert in Seattle that Graves disrupted, to Cage’s delight. Eventually they shared a house. Graves, who had traveled in Asia, introduced him to Zen, though flavored with his wild Dadaist demeanor. Then again, as Larson points out, “Zen wit” has always been a draw.

Cage’s convictions were eventually consolidated by attending classes at Columbia University in the 1950s with D.T. Suzuki, the first Japanese expert to bring Zen to America. Cage refined his artistic credo: “To use art not as self-expression but as self-alteration. To become more open.” This was the basis for his techniques of chance operations. According to Cage, indeterminacy short-circuits pre-judgment, so it pushes open the doors of the mind to the discrete reality of what’s present. It can also suggest new ways in which sounds or the 90 stories Cage assembled by chance are related—complex relationships emerge.

Larson makes good use of Cage’s writings, which were themselves (I think) even more widely influential than his music, and in some ways an essential part of the music.  The sense of liberation those writings gave those who didn't have the opportunity to experience his lectures and performances was inestimable, as Larson demonstrates.  For me, it was not that Cage's own aesthetic was the absolute alternative, but that there were no absolute alternatives:

They ask what/the purpose of art is.  Is that how/things are?  Say there were a thousand/ artists and one purpose, would one/artist be having it and all the nine hundred and/ninety-nine others be/missing the point?

Larson's book is full of the creative figures of 20th century art and music who felt that liberation and who were part of living it for others, from the fascinating but relatively obscure to the well-known, including Jasper Johns, Robert Rauschenberg, dancer Merce Cunningham, poet John Ashbery and Yoko Ono. Larson also writes about the impact of Zen on American arts in general, with Gary Snyder and the Beats. Together they explore an important substratum of particularly American culture.

John Cage (who died in 1992) was a presence. As a student I met him several times, heard him lecture and saw the Merce Cunningham dancers perform some of their collaborations. Several times Larson emphasizes the integrity of his presence, and his persistent open smile. That’s how I remember him. He embodied his world view more completely and serenely than anyone else I’ve encountered—apart from Buddhist monks.

Sunday, September 16, 2012


John F. Kennedy
By Alan Brinkley
Times Books

I turned 14 the summer Senator John F. Kennedy was nominated for President and I participated in my first political campaign. I had relatives I could visit in Washington to see the Inaugural parade in January, and by luck and pluck that weekend I was one of the first non-dignitaries to shake hands with the new President. So it isn’t surprising that I read the major books (and some minor ones) on JFK through the 1960s.  Then came the revisionists, the debunkers, the gossip-mongers, and I mostly passed on those, while being aware of their main points.

So I came to this relatively slim volume for a balanced view by a distinguished historian and biographer. It’s part of The American Presidents Series, co-edited by Arthur Schlesinger, Jr., the historian who was a White House aide to JFK and wrote his own long book on this presidency.  I assume he wouldn’t let serious errors pass.

Yes, Brinkley writes, JFK’s health was more precarious, and he was more of a womanizer than was known at the time. Brinkley handles such revisions fairly, though overall he seems so impatiently determined to demythologize JFK that it becomes harder to understand how Kennedy had such an impact on his times, or why he was so popular as President (and remains so.)  But mostly Brinkley’s narrative is factual. It lacks the detail to bring events to life, but most of the time it generalizes judiciously with the perspective of time and newer evidence.

It will probably amaze younger readers that so much of such importance and drama happened in such a brief presidency, including several foreign crises culminating in the Cuban Missile Crisis, as well as economic, political and civil rights struggles at home. I caught only one implied factual error; otherwise, Brinkley’s interpretations of events may not be mine, but they are at least plausible. (I’d certainly place more emphasis on JFK’s American University speech and subsequent championing of the limited nuclear test ban treaty.)

  On the question of whether JFK would have extracted the modest U.S. presence from Vietnam had he lived to be reelected, Brinkley is doubtful. But he doesn’t explain why he disagrees with the evidence to the contrary collected by government scholar James K. Galbraith, or the statements by Kennedy aides, including Schlesinger, that JFK was indeed planning to withdraw. Brinkley dismisses those who see JFK idealistically, oddly lumping them together with conspiracy theorists.

 Still, as a brief primer on the Kennedy presidency this book succeeds pretty well. I wouldn’t stop here, though. For all their omissions, longer and earlier books by the likes of Schlesinger and Ted Sorensen still give a fuller picture of JFK and his times.

Sunday, September 02, 2012

 
Orderly and Humane: The Expulsion of the Germans after the Second World War
By R.M. Douglas
Yale University Press

1940s movies and domestic propaganda put an understandably noble face on Allied efforts in World War II. It was only in the decades afterwards that some of the morally questionable and transparently brutal counter-evidence emerged: not just the Holocaust and Japanese Army atrocities but the British firebombing of undefended Dresden, the American saturation terror bombing of Tokyo, and the deadly irradiating of Nagasaki, to name the most prominent. By the 1960s, war novels by American veterans were revealing unflattering memories.  Revelations continue--in the novel By Blood, the previous book reviewed here, for instance, the reality that even after the war, even after some German concentration camps were liberated, Jews continued to live in them, because no other country would take them.

But in today’s soundbite history, World War II has reverted to the purity of The Last Good War fought by The Greatest Generation. The truth as it still emerges is a great deal more complicated for what one historian calls the number one “multicide” in human history, with a minimum of 66 million dead, at least half of them civilians. 

And the death toll did not end with the end of the war. Historian R.M. Douglas writes in compelling detail about a little known set of events: at war’s end the Allies ran “the largest forced population transfer—and perhaps the greatest single movement of peoples—in human history.” Some 12 to 14 million German-speaking civilians, most of them women and children, were extracted from their homes in eastern Europe and sent to Germany “amidst the ruins of the Reich, to fend for themselves as best they could.”

Some were transported in locked train freight cars, scenes that we know from Nazi treatment of Jews, and some were held in concentration camps, including Auschwitz. Starvation, disease and abuse resulted in half a million to a million deaths. Douglas explicitly states there is no legitimate comparison or equivalence to the Holocaust, but some of the Holocaust’s victims saw a moral parallel. An Auschwitz survivor viewed with shame the silence that surrounded these events. “We used to console ourselves by saying ‘only the Germans are capable of such things.’”

German speakers were expelled and their property confiscated largely at the behest of host countries (Czechoslovakia, Hungary and Poland) because they wouldn’t be tolerated by their majority populations, though no evidence was offered for this claim. The massive transfers—or what more recently would be called ethnic cleansings--quickly turned into bureaucratic and international nightmares.

 Douglas relies on verifiable sources to describe these expulsions within the context of the relevant and tangled history of Europe, especially the legacies of World War I. They bear directly on the 1990s ethnic violence in the Balkans, and aspects of this story are way too reminiscent of the recent Iraq occupation. This is an absorbing and eye-opening account, an educational astringent and an antidote to the oversimplified and self-satisfied cliches about World War II that constitute the current conventional wisdom.

Monday, August 06, 2012

By Blood--Quality Summer Read


By Blood: A Novel
by Ellen Ullman
Farrar, Straus and Giroux

In 1974 San Francisco, a professor on leave rents an office in an old building to write, but finds himself separated by a paper-thin wall from a working psychiatrist. Most of her sessions are masked by a white noise machine, but for one client she turns it off. That client’s plight becomes his obsession.

So there are two stories in this absorbing literary novel. Out front is an adopted young woman’s conflicted life that becomes centered on a search for her birth mother, as she describes it to her psychiatrist. But creating its own tension is the mystery of the professor who is narrating the story. While he seems a reliable (and eloquent) reporter, his own back story—doled out incompletely in a series of hints and asides as well as a few events—only adds to the creepiness of his insistent eavesdropping. It’s the era of the Zodiac killer, and we wonder: who is this guy, and why is his university investigating him? On the other hand, we’re eavesdropping right along with him.

  This narrative device maintains tension, but the substance of the woman’s story is also dramatic. Without giving away too much, it’s important to note that in the early 70s, a young woman during World War II would be in her 50s. The professor asks the central questions that grip him and the patient: “Does it matter who your father is? Your mother? Who are the exact people who dropped their blood into the container that is you?” As the patient staggers through identities—from her upper middle class Protestant upbringing to the possibility that she was born a Catholic (which her father despises) or a Jew (which he despises even more) born in Europe, and then specifically in Germany—the professor listens, and even finds a way to assist in the search.

By the end the reader learns a lot that historical soundbites leave out, and the central questions are explored in several dimensions. The professor-narrator is clearly looking for some redemption in witnessing and aiding the unseen patient’s journey. Whether he achieves it may require the reader to go back to the beginning and start reading the book all over again.

The narrator’s descriptions of 1970s San Francisco (as a man in a lesbian bar, and in the Castro) are vivid in themselves. The story is as gripping as in a mystery, and the quality of the writing as well as the ambiguities and layers of meaning define this as a literary novel in the best sense. Any seeming familiarity in the terrain is deflected by the very specific and well-rendered characters and events. Though it's been out for a few months, for now it’s a summer read of quality.

Saturday, July 21, 2012

For Pleasure: Summer


The pleasure of the summer so far is Nicholson Baker's novel The Anthologist (Simon & Schuster), which I picked up at--of all places--the Dollar Tree.  It helps that I'm interested in what a friend used to call "the poetry biz," although I haven't followed it since those days I was on the fringe of it, in the 1970s. (On the other hand, one of the poets named, Ed Ochester, did hire me to teach a course in 1990 or so.)  Part of the book's charm for me is that while not a lot happens, Baker's novelistic skill makes a lot out of only a few dramatic questions: will our hero actually ever write his anthology's introduction?  Will he get back with his lady love, or find a new one (maybe next door)?  Baker is such a fine and funny writer that this book is delightful.  There is real human feeling in it, too (what writer can't identify with those tears at dreams unfulfilled?), and his exegesis on the music of poetry makes a lot of sense. It probably also helps that I read part of it on a trip down the coast to Menlo Park, in sunny 80-degree afternoons and a morning outside at Cafe Borrone.  Pleasure doubled.

It seems summer wouldn't be the same without at least one 700 page book, and this year's is The Seven Basic Plots by Christopher Booker (Continuum International Publishing Group, paperback.)  You would think a book of that title would be slim and handy, or at least of a length easily amenable to writing classes and groups.  Indeed, Booker does get through his basic discussion of his seven plots (Overcoming the Monster, Rags to Riches, The Quest, Voyage and Return, Comedy, Tragedy and Rebirth) in about 200 pages, which is about as far as I've read so far.  But I'm not worried--the book is so cogently written, without jargon or "theory," and with frequent revelations that I have full confidence in the author to continue making this an enjoyable and illuminating reading experience, regardless of where I might agree or differ with his points.  I am in fact in awe of his disciplined yet almost conversational writing.  It was 34 years in the making (1969-2003), and 34 years well spent.  I look forward to the remaining 500 pages.

Other books I've read or am reading this summer not for review or for a direct writing purpose: Working the Soul: Reflections on Jungian Psychology by Charles Ponce (North Atlantic Books), Your Favorite Seuss (a "baker's dozen by the one and only Dr. Seuss," published by Random House) and Bookless in Baghdad: Reflections on Writing and Writers by Shashi Tharoor (Arcade Publishing, and another Dollar Tree acquisition.)

Earlier, in the spring, I read and very much enjoyed Chronic City, a novel by Jonathan Lethem (Faber & Faber.)

Saturday, July 14, 2012

E. O. Wilson and Social Evolution


The Social Conquest of Earth
Edward O. Wilson
Liveright

In examining the relationship of humans to the rest of nature as well as human nature itself, Edward O. Wilson has been a defining presence for some forty years. He created the field of sociobiology in the 1970s. He won the Pulitzer Prize for On Human Nature in 1979, and with later titles (Consilience: The Unity of Knowledge, and Biophilia) he added concepts (and buzzwords) to scientific and popular discussion.

His work has always been controversial, and assertions in this new book similarly attracted criticism from major figures. His rejection of “kin selection” in human evolution (a concept used to explain altruism that he previously championed) and his adoption of a form of “group selection” have received lengthy rejoinders from psychologist Steven Pinker and evolutionary biologist Richard Dawkins (Dawkins also cites strong disagreement from scores of other scientists.)

  It’s not possible even to summarize these arguments briefly. But I can instead discuss what contending scientists largely do not: this book as a reading experience. Whatever the results of scientific debate, the central theme of this book is a bracing corrective to the run of evolutionary theory since Dawkins’s selfish gene thesis began to dominate. Wilson looks at human evolution, not as isolated every-individual-for-itself battles of one against all (let alone every gene for itself), but by recognizing the reality of humans as social beings. He has a word for natural selection that responds to the social context as well as the rest of the environment: eusociality.


In under 300 pages, Wilson discusses the latest of what’s known about biological and cultural human evolution, and focuses his theories on the origins of morality, religion and the creative arts. Lynn Margulis noted that previous evolutionary theorists failed to account for the role of symbiosis because they studied big animals, not the bacteria and tiny organisms she studied. Wilson uses his knowledge of social insects (his first area of expertise was ants) to challenge prevailing assumptions that might be equally blinkered. Recent science in animal intelligence as well as a more modest view of genes themselves argue for greater complexity than the previously dominant mechanistic theories allowed. So a different approach like Wilson’s fills a need.

Some of his scientific arguments get technical. Some chapters seem more cogent than others (I found “What is Human Nature?” disappointing.)  Clearly his assertions are debatable.  But mainly Wilson’s prose is clear, informative and at times provocative, though its directness sometimes risks sounding dogmatic. Readers might best approach it as expressing a scientific point of view that is not always as settled as the writing might indicate.

This can also be read as a kind of updated summary work, a set of big picture conclusions, by an elder scientist (similar for example to Jerome Kagan's latest in that regard.)  However controversial, he presents a refreshing and even hopeful synthesis, arguing that knowing our biological history is essential to saving our magnificently unlikely species at “the growing point of an unfinished epic,” as well as the rest of present life we have imperiled.

Monday, July 02, 2012

Frayn's Novel Farce

Skios
By Michael Frayn
Metropolitan Books

On the stage, farce is about running in and out of doors, concealment and revelation, expectation and illusion, pretense and persuasion, need and want. Michael Frayn, who wrote what many regard as the best stage farce of the age (Noises Off), wondered if he could write farce as a novel, and this book is the convincing—and consequently very funny—result.

How could a younger, handsomer and utterly feckless guy successfully impersonate a staid expert guest speaker at the annual gala of an international foundation dedicated to preserving western civilization, held on the private Greek island of Skios? A lot of coincidence helps, but much aid comes from very contemporary examples of human nature. Rich, powerful and educated, the gulled audience nevertheless is a willing accomplice. When the imposter suggests he could easily be someone else, they happily agree. “We’re all such fools!”

There are some slamming doors and bedroom misadventures as well as star-crossed suitcases and taxi rides, though the mechanics of this farce also involve cell phones as the modern gateways to confusion. The ambitions, emotions and pretensions of a number of other characters are exposed and involved, including the real guest speaker—an expert in the “scientific management of science” whose convictions as well as disposition make him peculiarly vulnerable to chaos.

  Another contemporary mechanism of farce working here is the space-erasing jet engine, which makes the difference between places (Skios or skiing in Switzerland) way too easy to miss. What the novel adds to stage farce is getting inside the characters’ heads to learn the precise nature of their delusions, and the yearnings and weaknesses that feed them. How they—and we—tend to interpret the world from a few tellingly misunderstood clues is deliciously described.

 This novel extends not only from Frayn’s plays, previous novels and journalism but from his philosophical work (The Human Touch) with its insight that: “The world plainly exists independently of us—and yet it equally plainly exists only through our consciousness of it.” When the two factors collide you may have drama and tragedy, or comedy and farce. In this book, much of the farcical humor as well as character revelation resides in what people believe (and why they believe it) as contrasted with how things really are--or at least, how others believe they are.

Even with satirical touches, Frayn creates a convincing world so endearingly vulnerable to this kind of mayhem that farce seems inevitable, yet you do find yourself rooting for the irredeemably irresponsible protagonist to get away with it. There are sweet reminders of Kosinski’s Being There (as well as Ferris Bueller’s Day Off) as events wind and unwind in this fragile oasis of uncertain civility where bored but lionized experts speak to rich, dutiful but bored audiences. Not everything at the foundation is what it seems either, as the routines of greed undermine the supposed maintenance of civilization, in a fiendishly funny finish featuring a goddess.

Monday, June 18, 2012

Bring Me the Head of Philip K. Dick

How to Build An Android: The True Story of Philip K. Dick’s Robotic Resurrection
by David F. Dufty
Henry Holt

Blade Runner, a film that flopped when it was released in 1982, has since become legendary. It was made from a novel by Philip K. Dick (as were at least nine other movies, including Spielberg’s Minority Report and Arnold’s Total Recall.)  A highly productive California-based science fiction writer from the 1950s until his death in 1982, he inspired generations of other writers, notably fellow West Coasters William Gibson, Ursula Le Guin and Kim Stanley Robinson.  He still casts quite a shadow.  A hefty volume of his theories and observations (Exegesis) has recently been published, and the New York Times just ran a long 3-part series on Dick as a "sci-fi philosopher."

  As in Blade Runner, many of Dick’s stories involved androids and “replicants,” and the resulting confusions of identification and identity. This as well as other themes he explored made him a particular favorite among the sci-fi minded in computer, robotics and artificial intelligence fields. One of those was David Hanson, who was startling people with his android-ready heads with lifelike skin that internal motors could shape into realistic facial expressions. Another was Andrew Olney, who was programming robots capable of having realistic conversations as learning aids.

When they met in 2004, the idea to create a Philip K. Dick android was born. In a year, it was a reality. But even though a university lab and the FedEx Institute of Technology in Memphis became involved (the head of which one Halloween conducted his meetings in a Star Trek officer’s uniform), this complex project was mostly accomplished with very little money by Hanson and Olney, motivated by the technical challenges, the possibilities they imagined, and the kick of recreating Philip K. Dick.

  Author David F. Dufty, himself a postdoctoral fellow at the University of Memphis lab when this was going on, tells a good story about the ups and downs, ins and outs of the project, which involves a fair number of goofy mishaps, mostly due to the unforeseen (but easily foreseeable) quirks of mundane reality that androids (and their makers) have yet to master. That part of the story results in a conclusion that makes this almost mythic.

For Philip K. Dick was also famous for writing about conspiracies, and for believing they existed in the real world. (Then again, Watergate revelations in the 70s revealed that the government really had been spying on people like him.) So the threads of the story come together in one stranger than fiction event: on a flight to San Francisco for a demonstration at Google headquarters, his android head was left behind in an overhead bin.  Though an airline employee reported it found and on its way to its owner, it never arrived.  The android head of Philip K. Dick was never seen again.

  The geek-oriented or simply tech-curious will enjoy the descriptions of the innovative work that went into the project. With minimal jargon, Dufty makes the tech as well as the personalities part of the story flow, an admirable example for a growing genre covering a dominating industry: adventures in Tech World. He ably integrates Philip K. Dick’s biography and major works into the story, which makes it unique. The story itself reveals the classic relationship of sci-fi and new tech, as well as suggesting how surprisingly far towards the androids of fiction current technology is taking us. In focusing on Hanson (basically a sculptor and a hardware tinkerer) and Olney (a software guy) he chronicles another relationship of art and science.

  The book itself could have used more and better photos and illustrations, but otherwise it’s a fascinating account that transcends its specific subject.

Tuesday, June 05, 2012

Empathy: From Bench to Bedside
Edited by Jean Decety
MIT Press

Animal Thinking: Contemporary Issues in Comparative Cognition
Edited by Randolph Menzel and Julia Fischer
MIT Press

  Years of a rigid interpretation of Darwinian evolution tended to suppress study of qualities like altruism, compassion, cooperative behavior and empathy, since they didn’t fit into the model of an every-organism-for-itself struggle for survival and transmission of genes. Empathy was even dismissed as an aberration, a human concoction of religion and—often enough by implication--silly women. But the evidence for the existence of these qualities was too obvious to ignore forever, and more nuanced interpretations of natural selection helped bring them into respectable scientific study. So much time has been lost, however, that this research is still just beginning.

The collection of scholarly essays in Empathy: From Bench to Bedside is such a wide-ranging beginning. Though even the term is still ambiguous, editor Jean Decety ventures a working definition: “...the natural capacity to share, understand, and respond with care to the state of others...” Even the word “natural” is a great leap forward. The topics range from the relationship of empathy to evolution and human nature (Allan Young) and its possible evolutionary role (Sharee Light and Carolyn Zahn-Waxler) to empathy and neuroscience (Abigail Marsh, Jamil Zaki & Kevin Ochsner, Daniel Terman), its role in childhood development (Amrisha Vaish & Felix Warneken; Nancy Eisenberg, Snjezana Huerta, Alison Edwards), to its role in social psychology (Stephanie Echols & Joshua Corell, etc.) and medical care (Johanna Shapiro, Charles Figley, Ezequiel Gleichgerrcht & Jean Decety.)

Beginning with C. Daniel Batson on his “empathy-altruism hypothesis,” some articles outline further tests and issues for future research. This volume demonstrates the current complexity in the field, which seems partly intrinsic to empathy itself, but also to an uncertain range of definitions. But the true significance here may simply be that all this research exists and is expressed and documented in one volume.

Evidence of just how “natural” empathy might be is found in the chapter on primates and other mammals by Frans B. M. de Waal, a pioneer of empathy studies in general. His work on primates has been instrumental in bringing these qualities into scientific discussion. Why hadn’t other researchers found such evidence in primates before? Because, de Waal has suggested elsewhere, they weren’t looking for it. Instead their assumptions about evolution and about animals prevented them from seeing it.

Animal Thinking is another topic that would have been laughed out of science not so long ago. As editors Randolph Menzel and Julia Fischer write in their introduction, studies have gone beyond the questions of whether animals think at all, or whether they have any effective memory, to more nuanced issues, like how animals might plan and decide for the future.

The first section attacks a topic for which there is much evidence: animal navigation, from honeybees (Menzel) to birds (Verner P. Bingman.) Animal communication is another topic, even more fraught with problems of definition. While all possible manifestations of animal thinking bear upon the human, and in particular the issues raised concerning empathy and cooperation, the final section of this book, on “social knowledge,” confronts these issues directly, inevitably involving group as well as individual behavior.

These studies raise questions about scientific disciplines and future research. J.R. Stevens calls for a synthesis of evolutionary and psychological approaches to decision-making issues, but that seems good advice in general. He notes the emergence of cognitive ecology and evolutionary psychology as pertinent fields. While Derek Penn suggests that folk psychology “ruined comparative psychology,” scientific research would do well to pay attention to the knowledge of indigenous peoples accrued over centuries of intimate relationships and interdependent culture-creation with animals.

These essays are the product of the Ernest Strungmann Forum, and each topic section includes a synthesis or summary authored by a number of scholars. Both volumes are sturdy, no-nonsense, durable books with clear type. They will stand up to frequent reference and repeated readings, as well they should.

Monday, May 28, 2012


The Etiquette of Freedom: Gary Snyder, Jim Harrison, and The Practice of the Wild
Edited by Paul Ebenkamp
Counterpoint

Poet and eco-elder Gary Snyder grew up on a farm in Washington state. When he was a boy he asked his Sunday school teacher if he would meet a beloved and recently dead heifer in heaven. The clergyman said no. “Then I don’t want to go there!” said novelist Jim Harrison from across the dinner table, listening to Snyder tell this story. “That’s exactly what I said!” Snyder exclaimed, explaining that this exchange inspired his interest in other cultures that included the non-human in their moral universe.

It’s a moment in the DVD of a documentary film directed by John Healey that is conveniently tucked in a plastic envelope at the back of this book. So to the music CD/concert DVD packages and all the other hybrids, add the book-with-the-movie. The book is marketed as conversations between “old friends” Snyder and Harrison, but in fact it is culled from the making of a film that is essentially about Gary Snyder, in which Harrison (a co-producer) participates. Also interviewed are poet Michael McClure (recalling Snyder from both the Beat and Summer of Love eras in San Francisco), poet (and Snyder's first wife) Joanne Kyger, academic Scott Slovic and others.

The film was keyed to the 20th anniversary of Snyder’s landmark book of essays, The Practice of the Wild. (I got that book when I spotted a used copy shortly after it was placed in the window of the old Arcata Bookstore. “I knew it wouldn’t stay there long,” the owner said.) This book includes a transcript of the movie, but also a lot more, including Snyder-Harrison conversations in which Harrison comes off a lot better than he does in the film. It finishes with some Snyder poems and a chapter from The Practice of the Wild.

Snyder seems aware that the movie allows him to correct the record on some things.  Jim Harrison suggests that what distinguishes Snyder's early poems, especially in Riprap (his first book) is the reality and rhythm of work.  (Snyder worked as a fire lookout, a logger and earlier as a merchant seaman.)  This insight apparently is not exclusive to Harrison--in the book Snyder mentions a doctoral thesis he's read about the relationship of his poems and work, which he thinks is overblown. In the movie he allows that "part of it was the work," but that he had a literary purpose.  He'd been studying classical Chinese which is "strongly monosyllabic" and decided to experiment with monosyllabic English words that came before the Norman influence.  He wanted to use this "old style language," these words that are as hard as rocks, to build "a little rock trail" of language.  One can add that this particularly fit his subject for that book.

The film takes its title from the book The Practice of the Wild, and it has an interesting moment when Snyder brings the conversation back to the book, as if its purpose seemed to be slipping away.  In any case he mentions that certain distinctions he made in that book have not caught on, even with environmentalists: nature as the totality of what exists, wild as the natural process, and wilderness as the place where the wild predominates.  It's a point that Jack Turner later made in his book, The Abstract Wild, especially as it relates to Thoreau's famous statement, "In wildness is the preservation of the world."

The book and movie discuss Snyder's experiences with Zen and Asian thought.  He talks about his Zen training in Japan, and quotes one of those great paradoxical Zen guides: "The perfect way is without difficulty.  Strive hard!"  In the book Snyder mentions having tea with Shunryu Suzuki, who started the San Francisco Zen Center.  Suzuki thought his students were too serious--he wanted them to have a sense of humor.  I laughed reading this, because I love Suzuki's books (and books about him) for their wonderful humor.

However, despite extolling Snyder's Pulitzer Prize-winning book, Turtle Island (a name among some Native Americans applied to North America, which has since become an accepted name), there is nothing in the movie--and only a little in the "outtakes" of the book--about his work on Native American beliefs and cultures.  That subject is somewhat controversial (some Native writers, notably Leslie Marmon Silko, criticized him for appropriations), but it is still an important element in any discussion of America and its natural world.  There was also mention of the bioregionalism movement in the movie, but not enough--especially in relation to The Practice of the Wild.

The DVD also has extras, including more from the interviews, and more of Snyder reading his poems. All this is good, because the movie itself is on the thin side. Still, the DVD provides the sense and sound of these people, and you get a little companionship along with the substance that is primarily available in the handy form of readable words in the book. And there are those filmed moments, like the one that begins this review, that really help illuminate Gary Snyder and his work.

But the book as a book is a little too much like an adjunct to the movie.  There is no index, no bios for the other people quoted in it, and the introductions are about making the movie, not about what's in the text. There is a list of books by Snyder (I've got something like 17) in the back, but the only book about him I've got isn't mentioned:  Gary Snyder: Dimensions of a Life, edited by Jon Halper (Sierra Club Books) which is a generous collection of essays about him by friends and family. (The cover is a painting by Haida artist Robert Davidson, who thought he was contributing it for a memorial volume.  When I saw it at his studio and reassured him that Snyder was still alive, he joked "I mourned for nothing.")

 For those interested in his prose on subjects that whiz by in this movie and book, there's of course The Practice of the Wild (1990) but also Earth House Hold (a founding document of the ecology movement in the 50s and 60s--the word "ecology" can be translated as Earth household), The Old Ways (1977), The Real Work (1980) and A Place in Space (1995) as well as parts of more recent books.

Still, in the end this book/movie is a garden of delights: interesting conversation, recollections, images and Gary Snyder reading his poems, which may not be essential to appreciating them but adds a special dimension. I once had the privilege of hearing him read for several hours at a time, several days in a row, a revelatory experience. Now I can pop this into the DVD player any time I want! Thanks to this book.

As I mentioned, this book is marketed as a conversation between Gary Snyder and Jim Harrison, which made me curious for so long that I finally bought it.  Though it isn't that, I'm not really disappointed.  I wouldn't mind reading the transcripts of those conversations, though.  But a conversation I'd really like to overhear would be between Gary Snyder and another writer who taught at the University of California at Davis at the same time as he did: Kim Stanley Robinson.

Saturday, May 19, 2012


Van Gogh: Up Close
Edited by Cornelia Homburg
Yale University Press

He is among the most mythologized of artists: the tortured, mad, solitary, ignored genius. As if his paintings are not striking enough, the severed ear and his suicide color them more strangely in our cartoon world. This calm and excellent volume is the antidote, as well as a revelatory look at the heart of his work.

Guided by his childhood in the Dutch countryside and his love of Japanese prints, Van Gogh “pushed the boundaries of close-up views of nature” as no other European artist, according to Cornelia Homburg, this volume’s editor, and a curator of the accompanying exhibition, now at the National Gallery of Canada.  Eleven illustrated essays examine subjects and sources. For example, Joseph J. Rishel looks at his early encounters with Dutch Masters such as Rembrandt. Jennifer Thompson surveys his contemporaries, including Monet and Renoir. But most of the attention stays on Van Gogh himself, and along the way the myth gets modified.

While some modernists may have pursued the Rimbaud formula of deliberate disordering of the senses, Van Gogh considered “peace of mind and self-composure” essential to the concentration on the “dusty blade of grass” and the process of painting well. "During periods of stress Van Gogh reinforced his efforts to stay calm, as he considered a stable frame of mind essential for any painter." While some have suggested Van Gogh’s illness was due to the intensity of his work, Richard Shiff in his essay writes that it was the other way around: only painting soothed him and offset his illness.

   Elements such as angles and how subjects were framed may have been generally influenced by early photography, though the composition of photographs tended to follow painting as well. But Van Gogh wasn't interested in photographic realism. He saw the use of color and texture as essential to painting.  Van Gogh’s use of color also doesn’t require theories about warped brain chemistry—Jennifer Thompson writes that he consciously used color to convey feeling and meaning. He wanted to suggest the sound of the ear of corn swaying in the breeze.

This book also suggests that Van Gogh was not so universally ignored in his time.  At the time he was painting these "close-ups" of nature in the south of France, his latest work was receiving very favorable attention from his fellow artists.  Both Dutch and French journalists were writing about him, and the first major article about his work by a distinguished critic appeared in 1890.  (Van Gogh however still didn't think his work was really understood.) 

   Illustrations abound in this book. In my experience, there’s nothing like seeing the actual painting. I had the chance to stand quietly in front of Van Gogh’s “Wheat Fields at Auvers Under Cloudy Sky” for a considerable time at the Carnegie Museum of Art in Pittsburgh, and no illustration I’ve seen—including the one in this book—captures the luminous colors and textures, or their effects. But quite a few of the 100 illustrations in this volume are the best I’ve seen, and give a real sense of a painting. Several are simply astonishing. That’s due in part to the large format (a few are spread over two wide pages) and the care with color. They certainly are much better than anything now available online.

“Art requires a sense of humility, willingness to perform hard labor, and an ability to see beauty in the most humble of places,” writes Anabelle Kienle, summarizing a Van Gogh letter. The combination of these texts and illustrations gives substance and a sense of joy to his achievement. And together they may illuminate the flowers, trees, fields and skies of this spring in new ways as well.

Saturday, May 05, 2012


Lone Survivors: How We Came To Be The Only Humans On Earth
By Chris Stringer
Times Books

There’s an illustration I recall from an old schoolbook that still defines the popular conception of human evolution (at least for those who believe in it at all.) It depicted a progression of primates, hairy apes becoming slightly less hairy and more two-legged until the crouching Neanderthals become the modern human, upright and standing tall, poised to invent subprime mortgages. It’s a tale of destiny and inevitable progress, and it’s pretty much wrong.

Chris Stringer is a prominent paleoanthropologist affiliated with the Natural History Museum in London. In this book he attempts a comprehensive survey of the still-changing picture of human species development. He begins with the state of knowledge in the 1970s, when he started his professional work, and he describes in some detail the factors involved in how that picture has changed. He explores fast-moving advances in contributing fields: not only in new techniques for locating and unearthing fossils, but in dating them with exotic new technologies (like the synchrotron, of which a very big example is the Large Hadron Collider.)

  Genetics now contributes in various ways, advancing with stunning speed. The human genome was essentially sequenced just over a decade ago. Now there’s a sequenced Neanderthal genome. Stringer also considers cultural questions based on artifacts such as tools, paints and musical instruments.  I found a lot of this interesting but also frustrating.  Paul Shepard shows--and evidence from contemporary Indigenous peoples as well as their traditions demonstrates--that humans learned a lot by imitating nature and other animals, as well as through their interactions with animals (in the hunt, for example, which Shepard suggests influenced the intelligence of both hunter and prey.)  Some cultural aspects Stringer finds mysterious may become less so if these factors are fully considered. 

Stringer writes a lot about how information was developed, so the big picture emerges in fragments. While all the new data answers some questions, basically it seems to have complicated the story. It’s now considered likely that several of many human species (an earlier book counted 22) coexisted at the same time, maybe in the same place. Modern humans carry some Neanderthal DNA (and before the caveman jokes start, Neanderthal males and females may have been more equal physically and culturally than modern humans are.)

New early humans have been discovered, notably a species called Homo floresiensis, found in 2004. Because of its small size, it was quickly dubbed “the Hobbit.” This species seems to have existed only on a single island near Java, and remains mysterious and very provocative in what it suggests about the vagaries of evolution. These humans existed—and died off--apparently in isolation, perhaps just 18,000 years ago.  This suggests the fragility of a species' survival.  But then, for one reason or another, our own existence outside of Africa, whence we came, may be owed to only a few hundred surviving travelers.

    So how did we become the only humans on earth? Did modern humans develop traits that gave them a competitive advantage through natural selection? In some ways that's likely, but traits that survived for no discernible reason (genetic drift) also helped. I’ve noticed that in recent years, historians are taking the role of climate more seriously as a causal factor in the rise and fall of civilizations. Similarly, this book describes climate changes as crucial elements in the prehistoric story of human species. It may well be that one reason our species survived is that while other human species battled with extreme and disastrous effects of climate change, our forebears were in southern Africa, with a relatively stable climate.  It is perhaps ironic that this area is now being hit hard by today's climate crisis.

There are still plenty of puzzles, but Stringer concludes that we’re here at least partly by accident, by luck. “Sometimes the difference between success and failure in evolution is a narrow one,” he concludes, and notes that we’ve now got “an overpopulated planet and the prospect of global climate change on a scale that humans have never faced before. Let’s hope our species is up to the challenge.”

Tuesday, April 24, 2012


Psychology’s Ghosts: The Crisis in the Profession and the Way Back
By Jerome Kagan
Yale University Press

Contemporary psychological research is too flawed in its premises and procedures to really prove what it says it proves. So it may provide an errant basis for diagnoses and treatments, or the grand explanations of human thought and behavior derived from it, perhaps as expressed in such books as Daniel Kahneman’s Thinking Fast and Slow (reviewed here below.)

Such a critique comes from a psychologist so eminent (and so old) that the castigation it invites from the operationally arrogant psychological establishment won’t hurt him. Since it’s by Jerome Kagan, Harvard professor of psychology Emeritus and a distinguished author, it might even be considered. Though I don’t think even he is counting on it.

Kagan isolates four problems, which come down to overconfidence in a fatally limited set of assumptions. He notes for example that psychologists often ignore differences in their human subjects, such as age, cultural background and class, as well as the setting. “Too many papers assume that a result found with forty white undergraduates at a Midwestern university responding to instructions appearing on a computer screen in a small, windowless room would be affirmed if participants were fifty-year-old South Africans administered the same procedure by a neighbor in a larger room in a familiar church in Cape Town.”

This is not drollery: American university students of European background were the main subjects for more than two-thirds of the papers published in six leading journals between 2003 and 2007. Samples are usually small, yet universal conclusions are offered.

  “Missing Contexts” is one such problem. Another is assuming everyone shares your definitions. Kagan finds that psychologists assumed that “happiness” always means the kind of self-aggrandizement that appeals to Americans. “Not one of the seven greatest pleasures listed by one American writer...referred to acts that helped another.”  Yet other cultures “celebrate states of serenity, the quality and obligations linked to personal relationships, and social harmony.”

  Another problem is inferring too much from a single measurement instead of a pattern. Psychologists too often ignore social class in assessing “symptoms,” and can be too quick to classify a trait or behavior as an illness, regardless of origin or personal difference. This is more dangerous now that drugs with serious side effects are so quickly prescribed, and all but forced on some children whose high spirits may become hyperactivity, whose sadness is defined as depression, and whose shyness becomes social phobia.

Kagan is much more thorough and precise in this remarkable book. He has a chapter of positive recommendations, but as he notes, he’s not the first to point out these limitations, which have so far mostly been ignored. What he’s basically calling for is some humility, and acknowledgement of complexity, differences and connections.




I'll tell you why these conclusions particularly resonate with me.  One of the most famous and most often quoted lab psychology experiments of modern times is the Milgram obedience experiments, begun in the early 1960s and repeated into the 1970s, in which participants were instructed to give electric shocks to people on the other side of a wall whenever they answered questions incorrectly.  With each incorrect answer the shock was intensified, and the victim could be heard screaming in pain.  The victims weren't actually getting shocks--they were in on the con.  The experiment was to see how many people would follow instructions and administer the shocks.  The answer was a shockingly high percentage of them.  I first heard it reported as being 100%.  Later the figure given for those willing to administer the maximum voltage (after the experiment had been repeated several times) was about 65%.

This experiment was said to prove two main points: that people are willing to do what authority tells them to do, and that people will do so in that situation even if they believe that they wouldn't.  This in turn has led to a new conventional wisdom (based on this and other experiments): that people will take on the characteristics of their role in a system regardless of their personal ethics--for instance, that prison guards will always be sadistic toward prisoners.

Though the Milgram experiments were conducted in various ways and places over the years, enough to apparently satisfy Kagan, I have good reason to doubt that they prove any of that.  For I was in New Haven in 1970, where and when the experiments were being done, and I answered an ad in the paper that (according to Elliot Aronson's description) was recruiting participants for these experiments.  My motivation to answer the ad was that it promised to pay $25.  I was "out of college/money spent" and checking out the New Haven scene.   But I called first and asked questions about the experiment.  I was actually concerned about whether drugs were involved.  I believe I was assured they weren't, but the answers I got about the experiments were otherwise so vague and deceptive that I immediately became suspicious that this wasn't on the up and up.  (And in fact it wasn't--the object of the experiment was not what the ad said it was: investigating how people learn and remember.)

So who were the participants?  They were people who (like me) needed or wanted $25, which leaves out a lot of people to begin with; and if you needed the money, you might well be reluctant to refuse what you were told to do to earn it.

But the participants also were not people who (like me) decided not to take part because it didn't smell right.  In other words, they were people predisposed to put themselves in the hands of academic/scientific authority figures without question.  Hardly surprising that they would then do what these authorities told them.

So I'm not sure this was a true random sample of the population.  Furthermore, the results never smelled right to me either.  They were supposed to show that nearly everybody will obey an authority figure, to the point of causing pain.  But this was 1970--during the Vietnam war, just months before the Cambodia invasion and the massive campus protests in New Haven as elsewhere, ending in deaths at Kent State and Jackson State.  Antiwar and radical politics were rife among students in New Haven; the Berrigans were spiritual leaders there.  It was precisely the time, and one of the places, where refusing the dictates of authority was happening all the time.  And it was precisely the time when causing people pain--bombing them, torching their villages, and sending naive young Americans to kill and die--was a very big issue.

So I have to wonder--just how many antiwar protesters, draft resisters and conscientious objectors answered that ad?  Nixon was President! Students were burning their draft cards. Ken Kesey, Timothy Leary, Abbie Hoffman, SDS, sex, drugs and rock & roll were culturally ascendant.  If there was ever a time and place where disobeying authority was culturally, politically and morally prominent, this was it.

So it's more than just possible that a great many people who would laugh at the guy in the white coat, tear the wires out of the machine and spit on their clipboards before singing "Alice's Restaurant" on the way out, just didn't answer that ad.  And that's pretty much what that experiment proves. Missing context indeed.

Friday, April 13, 2012

Thinking, Fast and Slow
By Daniel Kahneman
Farrar, Straus and Giroux

How does the human mind work? Each publishing season unleashes another cascade of books addressing that subject from a wide array of perspectives. This recent one got a lot of attention, partly because the author won the 2002 Nobel Prize in Economics for his work on decision-making.

Based on an experimental psychological approach, Kahneman’s premise is that we think basically in two ways: our “automatic system” of intuition and emotion (the “thinking fast” of his title) called System 1, and the more effortful system that employs logic, calculation and deliberation (“thinking slow”) called System 2.   

He narrates how these systems work, engagingly at first. But then he explains that he doesn’t continue using the “more descriptive” term “automatic system” because it takes longer to say than “System 1” and so “takes up more space in your working memory,” which might distract you. Really? And is that less distracting than the extra step of trying to remember what System 1 stands for, each time it comes up?

I also wonder if he’s playing fair. He gives a quick math puzzle and instructs the reader: “Do not try to solve it but listen to your intuition.” Intuition gives most people the wrong answer, which prompts him a page later to cite this as evidence of “a recurrent theme of this book,” that people “place too much faith in their intuitions.” Or maybe too much faith in following instructions?

Apart from such quibbles, which may (or may not) have more to do with the experience of reading than with the science that is the book's subject, what’s his basic point? Because the automatic system usually predominates, people make bad decisions, basically because they don’t calculate risks or otherwise employ statistical thinking but go with their cognitive biases. This makes for poor business strategies, bad stock market choices and lousy vacations.

This has some explanatory appeal (maybe rationalizing Republicans, or economists), but he seems to treat as axioms the propositions that effortful thinking is basically statistical, and that statistics are the royal road to truth and happiness. Both my Systems tell me this is too limited. Relying on statistics doesn’t always work that well for science, let alone life. Statistics tell you about probabilities and groups. They deal well with at most two or three variables. Life involves the unique interplay of many more.

There’s some healthy skepticism about “expert intuition,” which he equates with guessing, but not a very sophisticated view of intuition as holistic apprehension.  Some of the studies are provocative, but the methodology is often suspect and the conclusions too sweeping, at least for the evidence he presents.  Some of his insights are very interesting, but as a reader I'm not convinced enough by his argument, evidence or tone to have general confidence in the content.  Even though he appears to be more self-aware than many who write on these topics, his writing persona can be pretty smug.

 Kahneman’s style can be engaging, and there’s much to explore in these many pages--and a lot to argue with-- if as a non-specialist you can see the point of navigating the math. Despite the generous if not fulsome praise by academic heavyweights displayed on its back cover (one placing this book in the company of Freud’s and Adam Smith’s landmark works, another calling Kahneman “one of the greatest psychologists and deepest thinkers of our time”) my statistically insignificant experience was of much ado about not enough.

Friday, March 23, 2012


Lives of the Novelists: A History of Fiction in 294 Lives
By John Sutherland
Yale University Press

The novel arose with the literate middle class that it was designed to please and ultimately to reflect. From its beginnings in the 17th and 18th centuries it was the most egalitarian of literary forms, mashing up everything from the high art of epic poetry and the high aspirations of religious texts to the common trade of letters, journals, folk tales, political pamphlets and popular accounts of exploration and adventure. Through the years it became the most elastic and universal form, linking the world in story.

One way to tell the novel’s history could be through the lives of novelists. British professor, columnist and critic on radio and TV John Sutherland writes two to four breezy pages about 294 of his favorites. Historically they range from John Bunyan and Samuel Richardson in the 17th and 18th centuries to Paul Auster’s latest in 2008.

As for accuracy and judgment, I can only spot-check against what I already knew. Jacqueline Susann’s novels were indeed assembled by hired pens (I met one), but I’m not sure I would call Kurt Vonnegut’s architect father “successful,” and even a brief bio of John Barth seems incomplete without mentioning the novel that made him famous, Giles Goat-Boy. The life I know the most about—H.G. Wells—suffers seriously from this summary treatment. But these are personal essays, not refereed encyclopedia entries. Astringent, ironic, breezy, cynical and lyrical in turns, individually they are not boring or unbiased.

Collectively, the question is do they tell the story of the novel? The early entries are promising, as the novel form is unintentionally assembled from individual obsessions and reactions to them, often expressed in parody. The 19th century was the novel’s high point as popular entertainment as well as literature, and here Sutherland especially hits his stride in combining individual biography, its echoes in the writer’s work, and the social and cultural context. His Dickens entry is excellent.

When he gets to the 20th century, however, the portraits seem more sensational and less literary. Gossip can be made into literature, but gossip is not literature, nor much of a key to how literature is made. Sutherland restricts himself to English-language writers, so huge influences like Kafka and García Márquez are absent; also missing are English-writing innovators like Sinclair Lewis, Thomas Pynchon and Doris Lessing.

Still, the inclusion of writers unknown today (many of whom were popular in their time, especially women) and of genre novelists (westerns, crime, romance, science fiction) adds crucially to the historical narrative. They help demonstrate the vitality of the novel form in all its wildness as well as its polish. That wildness not only reflects life but also helps the novel (the word simply means “new”) continue to surprise.

Friday, March 09, 2012


Searching for Utopia: The History of An Idea
by Gregory Claeys
Thames & Hudson

When I saw this book displayed at Northtown Books here in Arcata, it reminded me that while the once promising field of Future Studies has waned, there’s a curious new interest in Utopia Studies. In this era of dire predictions and popular fictions of apocalyptic futures it seems counter-intuitive, but it is precisely in dark times that utopian visions flourish.

This volume is one in a series of illustrated histories, but the pictures are less impressive than the prose. Claeys debunks several persistent mischaracterizations about the literature of utopia, first and foremost, that “utopia” necessarily means a perfect world. Most utopian stories are about a “radically improved” society. Like the story that gave the idea its name—Thomas More’s Utopia—it often responds to what we might call the tyranny of the 1%, and depicts a more egalitarian society.

But utopian stories vastly predate More’s 16th century work, and appear around the world, from indigenous cultures to Chinese, Hindu and Muslim civilizations.  Many have religious roots and hark back to a mythical Golden Age. That changes with H.G. Wells and other modern writers who begin locating utopia in the future, now the dominant notion.

Utopia was often located on an island or in a hidden place, like Shangri-la. In Thomas More’s time, America was the hoped-for place where utopia could happen, and utopian ideals drove many actual political and social experiments, from the founding documents of the United States to hundreds of communities organized on utopian lines in the 19th century.

Some utopias turned very dark, especially when linked with scientific pseudo-theories (like eugenics) and technology. When attempted by murderous dictatorships, the catastrophic results poisoned the very idea of utopia. And so psychology as well as politics enters the utopian story. External change is not enough; self-knowledge becomes a utopian endeavor.   This survey of course can't include everything, but it should be noted that among psychologist writers who addressed this matter of approaching a better society through individual self-knowledge were Carl Jung and his American successor, James Hillman.  From another point of view, the Dalai Lama has spoken and written about this subject specifically.

Claeys’ survey of science fiction—the chief generator of utopian fictions in modern times—is cursory and not particularly insightful.  It brushes by the particular contributions of Ursula Le Guin, George Zebrowski, Greg Bear, Bruce Sterling, William Gibson and other contemporary science fiction and fantasy writers, and especially Kim Stanley Robinson, who is known for his utopian concerns and who edited a volume of utopian stories.  But worst of all, the world’s most widely known utopian saga of the past half century isn't even mentioned: Star Trek. This is a glaring omission, even for a Brit.  Then again, he ignores the most recent incarnation of Doctor Who, which confronted this topic in a number of stories, including one involving the quest for a mythical planet called Utopia.

The eloquent final chapter examines the present, when the response to onrushing ecological disaster caused by our civilization is to shop harder. He concludes: “The old ideal worlds can lend us hope, inspiration, a sense of what to aspire for as well as what to avoid. But our ideal world must be very much our own creation, and a serious reckoning with the fate we face if we fail to create it.”

Utopia, like hope, is an activity of the present.  Zebrowski and Kim Stanley Robinson are particularly insistent on utopia as a process, which is an analogue to President Obama's goal of perfecting what can never be perfect, that "more perfect union."  Utopia is a process of imagination and effort, motivated by basic human impulses, including love for future generations and our planet.