On being human: Lessons from Harvard Institute for Management and Leadership in Higher Education

In a specific sense I am an unabashed advocate of what has come to be called the applied humanities: roughly and broadly speaking, the effort to connect the study of the humanities to identifiable and practical social goods.  For me it also includes the effort to develop humanities programs that take seriously our responsibility (at least in significant part) for preparing students for productive lives after college, preparation that should be embedded within humanities curricula, advising, cocurricular programming, and the general ethos and rhetoric we use to inculcate in our students what it means to be a humanist.

In several respects this conviction lies at the root of my advocacy both for digital humanities programs and for career planning and programming for liberal arts students, as different as these two areas seem on the surface.  I have little room left anymore for the idea that “real learning” or intellectual work pulls up its skirts to avoid the taint of the marketplace or the hurly-burly of political arenas, and that we demonstrate the transcendent value of what we do over and above professional programs by repeatedly demonstrating our irrelevance.  Far from diminishing the humanities, an insistence that what we do has direct and indirect, obvious and not-so-obvious connections to social value enhances the humanities.  It’s not just a selling point to a doubting public.  As I said yesterday, the only good idea is the idea that can be implemented.  We ought to be proud that we can show directly how our students succeed in life, how they apply the things they’ve learned, how they find practical ways of making meaningful connections between their academic study and the world of work.

At the same time, I will admit that some versions of this argument leave me cold.  They risk saying that the only thing really valuable about the humanities is what is practically relevant to the marketplace. I greet this effort to make Wordsworth a useful version of a management seminar with a queasy stomach.

It may sound like a nice day out in beautiful surroundings, but can walking around Lake District sites synonymous with Romantic poet William Wordsworth really offer business leaders and local entrepreneurs the crucial insights they need?

That is precisely the claim of Wordsworth expert Simon Bainbridge, professor of Romantic studies at Lancaster University, who believes the writer can be viewed as a “management guru” for the 21st century.

Since 2007, the scholar has taken students down into caves and out on canoes to the island on Grasmere once visited by Wordsworth and fellow poet Samuel Taylor Coleridge, and to places where many of the former’s greatest works were written, for what he called “practical exercises linked to the themes of Wordsworth’s poetry.”

Such walks, which also have been incorporated into development days for individual firms, are now being offered as a stand-alone option for local and social entrepreneurs at a rate of £175 ($274) a day.

Read more: http://www.insidehighered.com/news/2012/08/09/businesses-pay-british-professor-teach-them-about-wordsworth#ixzz236bQaECf 
Inside Higher Ed 

I do not find the insight here wrong so much as sad.  If the only reason we can get people to read Wordsworth is that he will enhance their management skills, we have somehow misplaced a priority, and misunderstood the role that being a manager ought to play in our lives and in the social and economic life of our society.  What is to be objected to, and what every educational institution worthy of the name ought to resist furiously, is the apparent reduction of all things and all meaning to the marketplace, not the fact of marketplaces themselves.

I was lucky enough this summer to attend the Harvard Institute for Management and Leadership in Education.  To be honest, I went thinking I was going to get all kinds of advice on things like how to organize projects, how to manage budgets, how to promote programs, how to supervise personnel.  There was some of that, to be sure, but what struck me most was that the Institute, under the leadership of Bob Kegan, put a high, even principal, priority on the notion that managers have first to take care of who they are as human beings if they are to be the best people they can be for their colleagues and their institutions.  You have to know your own faults and weaknesses, your own strengths, your dreams, and you have to have the imagination and strength of mind and heart (and body) to learn to attend to the gifts, faults, dreams, and nightmares of others before, or at least simultaneously with, your own.  In other words, being a better manager is first and foremost about becoming a healthier, more humane, fuller human being.

The tendency of some applied humanities programs to show the relevance of poetry by showing that it holds insights into management techniques, or the relevance of philosophy because it will help you write a better project proposal, is to misplace causes and to turn the human work of another imagination (in this case Wordsworth’s) into an instrumental opportunity.  The reason for reading Wordsworth, first and foremost, is that Wordsworth is worth reading, and simultaneously that the encounter with Wordsworth will give you the opportunity to be a fuller, more imaginative, more thoughtful human being than you were before.

If you become that, you will have a chance to be a better manager.  But even if you don’t become a better manager, or if you lose your job because your company is overtaken by Bain Capital or because students no longer choose to afford your pricey education, you will, nonetheless, be richer.


Dreaming of Sleep: Madison Smartt Bell’s Doctor Sleep

It’s a pleasure to pick up a novel and know from the first lines that it will be worth the read. We usually have to give more of ourselves over to novels than to other forms of writing–a problem in our frantic clickable age. With stuff that I remand to my Instapaper account, I can glance through the first paragraph and decide if I want to keep reading, and if I read three or four paragraphs and am not convinced it’s worth the time, there’s no point in agonizing about whether to keep going. Same with a book of poetry: if I read the first three or four poems and there’s nothing there other than the author’s reputation, I mostly put it aside–though I might admit that it is my failing and not the poet’s.

But novels are a horse of a different color. A lot of novels require 50 pages and sometimes more before we can get fully immersed in the writer’s imaginative world, feeling our way into the nuances and recesses of possibility, caring enough about the characters or events or language or ideas (and preferably all four) to let the writer land us like netted fish. I think I’ve written before about the experience of reading yet one more chapter, still hoping and believing in the premise or the scenario or the author’s reputation. I’ve finished books that disappointed me, though my persistence was more like that of a teenaged boy fixed up on a date with the best girl in school, pretending until the evening clicks to a close that he isn’t really bored to tears, that things just haven’t gotten started yet.

But Madison Smartt Bell’s Doctor Sleep didn’t make me wait. I bought the book on reputation and topic. I loved Bell’s All Souls Rising, but got derailed by the follow-ups in his Haitian trilogy, never quite losing myself in the Caribbean madness that made the first book the deserved winner of an array of awards. Thus disappointed, I hadn’t really picked up Bell’s work since, though I vaguely felt I ought to. From the first sentences of the novel, I was under the spell. The choice of words is purposeful, since the book is about a hypnotherapist who, while helping others solve all manner of problems and difficulties through his particular gift for putting them under, can neither solve his own problems nor put himself to sleep: he suffers from a crippling case of insomnia.

Like any good novel, its meanings are thickly layered. In some respects I found myself thinking of the Apostle Paul’s dictum that, wretched man that he was, he knew what he should do and wanted to do it, but could not do the very thing he knew to do; indeed, the very thing he did not want to do was the very thing he did. The tale of all things human: the disjunction between knowledge and will, between thought and desire and act. The main character’s skills as a hypnotist are deeply related to his metaphysical wanderings amidst the mystics and heretics of the past, most particularly Giordano Bruno, burned at the stake because he claimed the church sought to promote good through force rather than through love. Ironically, the main character knows all this, knows in his head that love is the great unity of which the mystics speak, and yet turns away from love into abstraction, failing to love women because he cannot see them as human beings to whom he might be joined, seeing them instead as mystic abstractions through which he wants to escape the world.

In the end, accepting love means accepting death, which means accepting sleep–something that seems so natural to so many, but if you have suffered from insomnia as I do, you realize that surrendering to sleep is a strange act of grace, one that cannot be willed, but can only be received.

I think in some ways there’s a lot of reflection in this book on the power of words and stories, their ability to put us under. So, perhaps inevitably, it is a book about writing and reading on some very deep level. Adrian, Doctor Sleep, takes people on a journey into their unconscious through words, and his patients surrender to him willingly. Indeed, Adrian believes, with most hypnotists, that only those who want to be hypnotized actually can be. This is not so far from Coleridge’s notion of the willing suspension of disbelief. I do not believe Adrian’s metaphysical mumbo-jumbo, but for the space of the novel I believe it utterly. We readers want to be caught. We want to lose ourselves at least for that space and that time, so that reading becomes a little like the gift of sleep, a waking dream.

Under the spell of writing we allow Bell to take us into another world that is, surprisingly, like our own, one in which we see our own abstractedness, our own anxieties, our own petty crimes and misdemeanors, our own failures to love.

The Best of Times, the Worst of Times: The U of Missouri Press is closing, Jennifer Egan is Tweeting

A bad week for publishing, but sometimes it seems like they all are.  First I was greeted with the news that the New Orleans Times-Picayune has cut back circulation to three days a week.  Later the same day, three papers in Alabama announced a similar move to downsize and reduce circulation.  Apparently being an award-winning newspaper that does heroic community service in the midst of the disaster of a century is no longer enough.

Then today’s twitter feed brought me news of another University Press closing.

University of Missouri Press is closing after more than five decades of operation, UM System President Tim Wolfe announced this morning.

The press, which publishes about 30 books a year, will begin to be phased out in July, although a more specific timeline has not been determined.

Ten employees will be affected. Clair Willcox, editor in chief, declined to comment but did note that neither he nor any of the staff knew about the change before a midmorning meeting.

In a statement, Wolfe said even though the state kept funding to the university flat this year, administrators “take seriously our role to be good stewards of public funds, to use those funds to achieve our strategic priorities and re-evaluate those activities that are not central to our core mission.”

via University of Missouri Press is closing | The Columbia Daily Tribune – Columbia, Missouri.

Plenty has been said about the worrisome demise of daily papers and what the transformation of journalism into an online blogosphere really means for the body politic.  Will the Huffington Post, after all, actually cover anything in New Orleans if the paper goes under entirely?  Reposting is still not reporting, and having opinions at a distance is great fun but not exactly a form of knowledge.

The demise of or cutbacks to university presses is less bemoaned in the national press or blogosphere, but it is still worrisome.  Although I am now a believer in the possibilities of serious intellectual work occurring online, I am not yet convinced the demise of the serious scholarly book with a small audience would be a very good thing.  Indeed, I believe the best online work remains in a kind of symbiotic relationship with the traditional scholarly monograph or journal.  I keep my fingers crossed that this is merely an instance of creative destruction, and not destruction plain and simple.

On a more hopeful note, I will say that I thoroughly enjoyed the New Yorker tweeting Jennifer Egan’s latest story “Black Box” and am looking forward to the next installments.  I’d encourage everyone to “listen in,” if that’s what you do on Twitter, but if you can’t, you can read it in a more traditional but still twitterish form at the New Yorker Page-Turner site.  To get the Twitter feed, go to @NYerFiction.  The reviews have been mixed, but I liked it a great deal.  Egan is a great writer: less full of herself than some others, she has a great deal to say, and she’s willing to experiment with new ways to say it.  Her last novel, A Visit from the Goon Squad, experimented with PowerPoint-like slides within the text.  And there’s a nice article over at Wired about the piece, suggesting it may be signaling a revival of serialized fiction.

Let’s hope so; it will make up for the loss of the U of Missouri Press, at least for today.

Digital Humanities, “techno-lust”, and the Personal Economy of Professional Development: Ryan Cordell on simplification

It’s a sign of my unsimple and exceedingly overstuffed life that I’ve only now gotten around to reading Ryan Cordell’s ProfHacker piece from last week.  Ryan is moving to a new position at Northeastern (kudos!) and he’s taken the ritual of eliminating the clutter and junk of a household as a metaphor for the need to prioritize and simplify our professional practices and in his own instance to get rid of the techno gadgets and gizmos that curiosity and past necessity have brought his way.

I have confessed before my appreciation for Henry David Thoreau—an odd thinker, perhaps, for a ProfHacker to esteem. Nevertheless, I think Thoreau can be a useful antidote to unbridled techno-lust. As I wrote in that earlier post, “I want to use gadgets and software that will help me do things I already wanted to do—but better, or more efficiently, or with more impact.” I don’t want to accumulate things for their own sake.

…..

I relate this not to brag, but to start a conversation about necessity. We talk about tech all the time here at ProfHacker. We are, most of us at least, intrigued by gadgets and gizmos. But there comes a time to simplify: to hold a garage sale, sell used gadgets on Gazelle, or donate to Goodwill.

via Simplify, Simplify! – ProfHacker – The Chronicle of Higher Education.

Ryan’s post comes as I’ve been thinking a lot about the personal economy that we all must bring to the question of our working lives, our sense of personal balance, and our personal desire for professional development and fulfillment.  As I have been pushing hard for faculty in my school to get more engaged with issues surrounding digital pedagogy and to consider working with students to develop projects in digital humanities, I have been extremely aware that the biggest challenge is not faculty resistance or a lack of faculty curiosity–though there is sometimes that.  The biggest challenge is the simple fact of a lack of faculty time.  At a small teaching college our lives are full, and not always in a good way.  There is extremely little bandwidth to imagine or think through new possibilities, much less experiment with them.

At our end-of-the-year school meeting I posed to faculty the question of what they had found to be the biggest challenge in the past year, so that we could think through what to do about it in the future.  Amy, the faculty member who may be the most passionate about trying out the potential of technology in the classroom, responded, “No time to play.”  Amy indicated that she had bought ten new apps for her iPad that year but had not had any time to just sit around and experiment with them, to figure out everything they could do and imagine new possibilities for her classroom and the rest of her work.  The need for space, the need for play, is necessary for the imagination, for learning, and for change.

It is necessary for excellence, but it is easily the thing we value least in higher education.

When I discussed this same issue with my department chairs, one of them said that she didn’t really care how much extra money I would give them to do work on digital humanities and pedagogy; what she really needed was extra time.

This is, I think, a deep problem generally and a very deep problem at a teaching college with a heavy teaching load and restricted budgets.  (At the same time, I recognize that some of the biggest innovations in digital pedagogy have come from community colleges with far higher teaching loads than ours.)  I think, frankly, that this is at the root of some of the slow pace of change in higher education generally. Faculty are busy people, despite the stereotype of the professor with endless time to just sit around mooning about nothing.  And books are…simple.  We know how to use them, they work pretty well, they are standardized in terms of their technical specifications, and we don’t have to reinvent the wheel every time we buy one.

Not so with the gadgets, gizmos, and applications that we accumulate rapidly with what Ryan describes as “techno-lust”. (I have not yet been accused of having this, but I am sure someone will use it on me now).  Unless driven by a personal passion, I think most faculty and administrators make an implicit and not irrational decision–“This is potentially interesting, but it would be just one more thing to do.”  This problem is exacerbated by the fact that the changes in technology seem to speed up and diversify rather than slow down and focus.  Technology doesn’t seem to simplify our lives or make them easier despite claims to greater efficiency.  Indeed, in the initial effort to just get familiar or figure out possibilities, technology just seems to add to the clutter.

I do not know a good way around this problem:  the need for play in an overstuffed and frantic educational world that so many of us inhabit.  One answer–just leave it alone and not push for innovation–doesn’t strike me as plausible in the least.  The world of higher education and learning is shifting rapidly under all of our feet, and the failure to take steps to address that change creatively will only confirm the stereotype of higher education as a dinosaur unable to respond to the educational needs of a public.

I’m working with the Provost to see if I can pilot a program that would give a course release to a faculty member to develop his or her abilities in technology in order to redevelop a class or develop a project with a student.  But this is a very small drop in a very big bucket of need.  And given the frantic pace of perpetual change that seems characteristic of contemporary technology, the need for space to play, and the lack of it, is likely to be a persistent feature of our personal professional economies for a very long time to come.

Any good ideas?  How could I make space for professional play in the lives of faculty? Or, for that matter, in my own? How could faculty do it for themselves?  Is there a means of decluttering our professional lives to make genuine space for something new?

Why students of the Humanities should look for jobs in Silicon Valley

Ok, I’ll risk sounding like a broken record to say again that the notion that humanities students are ill-positioned for solid careers after college is simply misguided.  It still bears repeating.  This latest from Vivek Wadhwa at the Washington Post gives yet more confirmation of the notion that employers are not looking for specific majors but for skills and abilities and creativity, and that package can come with any major whatsoever, and it often comes with students in the humanities and social sciences.

Citing Damon Horowitz, who possesses degrees in both philosophy and engineering, whose unofficial title at Google is In-House Philosopher, and whose official title is Director of Engineering, Wadhwa points out the deep need for humanities and social science students in the work of technology companies, a need that isn’t just special pleading from a humanist but is made vivid in the actual hiring practices of Silicon Valley companies.

Venture Capitalists often express disdain for startup CEOs who are not engineers. Silicon Valley parents send their kids to college expecting them to major in a science, technology, engineering or math (STEM) discipline. The theory goes as follows: STEM degree holders will get higher pay upon graduation and get a leg up in the career sprint.

The trouble is that theory is wrong. In 2008, my research team at Duke and Harvard surveyed 652 U.S.-born chief executive officers and heads of product engineering at 502 technology companies. We found that they tended to be highly educated: 92 percent held bachelor’s degrees, and 47 percent held higher degrees. But only 37 percent held degrees in engineering or computer technology, and just two percent held them in mathematics. The rest have degrees in fields as diverse as business, accounting, finance, healthcare, arts and the humanities.

Yes, gaining a degree made a big difference in the sales and employment of the company that a founder started. But the field that the degree was in was not a significant factor. ….

I’d take that a step further. I believe humanities majors make the best project managers, the best product managers, and, ultimately, the most visionary technology leaders. The reason is simple. Technologists and engineers focus on features and too often get wrapped up in elements that may be cool for geeks but are useless for most people. In contrast, humanities majors can more easily focus on people and how they interact with technology. A history major who has studied the Enlightenment or the rise and fall of the Roman Empire may be more likely to understand the human elements of technology and how ease of use and design can be the difference between an interesting historical footnote and a world-changing technology.

via Why Silicon Valley needs humanities PhDs – The Washington Post.

Again, at the risk of sounding like a broken record: this echoes the kind of findings emphasized at the Rethinking Success Conference that I have now blogged about several times.    (I’ve heard theories that people come to be true believers if they hear a story 40 times.  So far I’ve only blogged on this 12 times, so I’ll keep going for a while longer.)  Although I still doubt it would be a good thing for a philosopher to go to Silicon Valley with no tech experience whatsoever, a philosopher who had prepared himself by acquiring some basic technical skills alongside his philosophy degree might be in a particularly good position indeed.  Worth considering.

Side note: the Post article points to a nice little bio of Damon Horowitz.  I suspect there are not many folks in Silicon Valley who can talk about the ethics of tech products in terms that invoke Kant and John Stuart Mill.  Maybe there should be more.

Are Writers Afraid of the Dark–Part II: Salman Rushdie’s contradictory views of censorship

A brief follow-up on my post from earlier today responding to Tim Parks’s notion over at the New York Review of Books that literature is actually characterized by fear and withdrawal from life rather than engagement with it.  Later in the day I read Salman Rushdie’s post at the New Yorker on censorship, a redaction of his Arthur Miller Freedom to Write Lecture delivered a few days ago.  Rushdie brings out the idea that, indeed, writers can be afraid, but that theirs is a fear born from the fact of their writing rather than their writing being a compensation for it.  Censorship is a direct attack on the right to think and write, and Rushdie argues that it can be paralyzing to the writer’s act.

The creative act requires not only freedom but also this assumption of freedom. If the creative artist worries if he will still be free tomorrow, then he will not be free today. If he is afraid of the consequences of his choice of subject or of his manner of treatment of it, then his choices will not be determined by his talent, but by fear. If we are not confident of our freedom, then we are not free.

via Salman Rushdie’s PEN World Voices Lecture on Censorship : The New Yorker.

Rushdie goes on to chronicle the martyrs of writing, who have had a great deal to be afraid of because of their writing (a point made in responses to Parks’s blog as well).

You will even find people who will give you the argument that censorship is good for artists because it challenges their imagination. This is like arguing that if you cut a man’s arms off you can praise him for learning to write with a pen held between his teeth. Censorship is not good for art, and it is even worse for artists themselves. The work of Ai Weiwei survives; the artist himself has an increasingly difficult life. The poet Ovid was banished to the Black Sea by a displeased Augustus Caesar, and spent the rest of his life in a little hellhole called Tomis, but the poetry of Ovid has outlived the Roman Empire. The poet Mandelstam died in one of Stalin’s labor camps, but the poetry of Mandelstam has outlived the Soviet Union. The poet Lorca was murdered in Spain, by Generalissimo Franco’s goons, but the poetry of Lorca has outlived the fascistic Falange. So perhaps we can argue that art is stronger than the censor, and perhaps it often is. Artists, however, are vulnerable.

Read more http://www.newyorker.com/online/blogs/books/2012/05/on-censorship-salman-rushdie.html#ixzz1vMorpS00

This is powerful stuff, though I’ll admit I started feeling like there was an uncomfortable contradiction in Rushdie’s presentation.  Although Rushdie’s ostensible thesis is that “censorship is not good for art,” he goes on after this turn to celebrate the dangerousness of writing.  According to Rushdie, all great art challenges the status quo and unsettles convention:

Great art, or, let’s just say, more modestly, original art is never created in the safe middle ground, but always at the edge. Originality is dangerous. It challenges, questions, overturns assumptions, unsettles moral codes, disrespects sacred cows or other such entities. It can be shocking, or ugly, or, to use the catch-all term so beloved of the tabloid press, controversial. And if we believe in liberty, if we want the air we breathe to remain plentiful and breathable, this is the art whose right to exist we must not only defend, but celebrate. Art is not entertainment. At its very best, it’s a revolution.

Read more http://www.newyorker.com/online/blogs/books/2012/05/on-censorship-salman-rushdie.html#ixzz1vMpu97Bt

It remains unclear to me how Rushdie can have it both ways.  If art is going to be revolutionary, it cannot possibly be safe, and it cannot but expect efforts at censorship.  If there is no resistance to art, then there is no need for revolution; everything will be the safe middle ground, and there will be no possibility of great art.

I am not sure which way Rushdie wants it, and I wonder what my readers think.  Does great art exist apart from resistance and opposition?  If it does not, does it make sense to long for a world in which such opposition does not exist?  Does Rushdie want to be edgy and push boundaries, but to do so safely?  Is this a contradictory and impossible desire?

You can also listen to Rushdie’s lecture below:

Are writers afraid of the dark?

In a new blog post at NYRB, Tim Parks questions the notion that literature is about the stuff of life, suggesting instead that it might be a kind of withdrawal from the complexity and fearfulness of life itself:

So much, then, for a fairly common theme in literature. It’s understandable that those sitting comfortably at a dull desk to imagine life at its most intense might be conflicted over questions of courage and fear. It’s also more than likely that this divided state of mind is shared by a certain kind of reader, who, while taking a little time out from life’s turmoil, nevertheless likes to feel that he or she is reading courageous books.

The result is a rhetoric that tends to flatter literature, with everybody over eager to insist on its liveliness and import. “The novel is the one bright book of life,” D H Lawrence tells us. “Books are not life,” he immediately goes on to regret. “They are only tremulations on the ether. But the novel as a tremulation can make the whole man alive tremble.” Lawrence, it’s worth remembering, grew up in the shadow of violent parental struggles and would always pride himself on his readiness for a fight, regretting in one letter that he was too ill “to slap Frieda [his wife] in the eye, in the proper marital fashion,” but “reduced to vituperation.” Frieda, it has to be said, gave as good as she got. In any event words just weren’t as satisfying as blows, though Lawrence did everything he could to make his writing feel like a fight: “whoever reads me will be in the thick of the scrimmage,” he insisted.

In How Fiction Works James Wood tells us that the purpose of fiction is “to put life on the page” and insists that “readers go to fiction for life.” Again there appears to be an anxiety that the business of literature might be more to do with withdrawal; in any event one can’t help thinking that someone in search of life would more likely be flirting, traveling or partying. How often on a Saturday evening would the call to life lift my head from my books and have me hurrying out into the street.

(via Instapaper)

I was reminded in reading this of a graduate seminar with Franco Moretti wherein he said, almost as an aside, that we have an illusion that literature is complex and difficult, but that in fact, literature simplifies the complexity and randomness of life as it is.  In some sense literature is a coping mechanism.  I don’t remember a great deal more than that about the seminar–other than the fact that Moretti wasn’t too impressed with my paper on T.S. Eliot–but I do remember that aside.  It struck me as at once utterly convincing and yet disturbing, unsettling the notion that we in literature were dealing with the deepest and most complicated things in life.

On the other hand, I’m reminded of the old saw: literature may not be life, but, then, what is?  Parks strikes a bit of a graduate-studenty tone here, presenting the obvious as an earthshaking discovery without really advancing our understanding of what literature might actually be and do.  He seems to take delight in skewering without revealing or advancing understanding.  There’s a tendency to set up straw men to light afire, and then strike the smug and knowing revelatory critical pose, when what one has revealed is more an invention of one’s own rhetoric than something that might be worth thinking about.

This desire to convince oneself that writing is at least as alive as life itself, was recently reflected by a New York Times report on brain-scan research that claims that as we read about action in novels the relative areas of the brain—those that respond to sound, smell, texture, movement, etc.—are activated by the words. “The brain, it seems,” enthuses the journalist, “does not make much of a distinction between reading about an experience and encountering it in real life; in each case, the same neurological regions are stimulated.”

What nonsense! As if reading about sex or violence in any way prepared us for the experience of its intensity. (In this regard I recall my adolescent daughter’s recent terror on seeing our border collie go into violent death throes after having eaten some poison in the countryside. As the dog foamed at the mouth and twitched, Lucy was shivering, weeping, appalled. But day after day she reads gothic tales and watches horror movies with a half smile on her lips.)

I’m tempted to say “What nonsense!”  Parks’s willingness to use his daughter to dismiss a scientific finding strikes me as a bit like the homeschooled student I once had who cited her father as an authority who disproved evolution.  Well.  The reference to the twitching dog invokes emotion that in fact runs away–in a failure of critical nerve, perhaps?–from the difficult question of how exactly the brain processes and models fictional information, how that information relates to similar real-world situations in which people find themselves, and how people might use and interrelate both fictional and “real world” information.

Parks seems to have no consciousness whatsoever of the role of storytelling in modeling possibility, one of its most complex ethical and psychological effects.  It is a long-standing and accepted understanding that one reason we tell stories at all is to provide models for living.  Because a model is a model, we need not assume it lacks courage or is somehow a cheat on the real stuff of life.  Horror stories and fairy tales help children learn to deal with fear, impart warnings, knowledge, and cultural prohibitions, and attempt to teach them in advance how to respond to threat, to fear, to violence, et cetera.  That those lessons are always inadequate to the moment itself hardly speaks against the need for such mental models and maps.  It would be better to ask what we would do without them.  The writer who provides such models need not be skewered for that, since to write well and convincingly, to provide a model that serves that kind of ethical or psychic purpose, the writer him- or herself must get close to those feelings of terror and disintegration.  It’s why there has always been a tradition of writers like Hemingway or Sebastian Junger who go to war in order to get into that place within themselves where the emotions of the real can be touched.  It’s also why there has always been a tradition of writers self-medicating with alcohol.

Thus I found Parks’s implied assumption that writers are cowering just a bit from the real stuff of life to be something of a cheap shot, the kind of charge that in the cultural stories we tell each other is usually associated with cowardice and weakness, in a writer or a fighter.  The novelists and poets Parks takes on deserve better.