Dreaming of Sleep: Madison Smartt Bell’s Doctor Sleep

It’s a pleasure to pick up a novel and know from the first lines that it will be worth the read. We usually have to give more of ourselves over to novels than to other forms of writing–a problem in our frantic clickable age. With stuff that I remand to my Instapaper account, I can glance through the first paragraph and decide if I want to keep reading, and if I read three or four paragraphs and am not convinced it’s worth the time, there’s no point in agonizing about whether to keep going. Same with a book of poetry: if I read the first three or four poems and there’s nothing there other than the author’s reputation, I mostly put it aside–though I might admit that it is my failing and not the poet’s.

But novels are a horse of a different color. A lot of novels require 50 pages and sometimes more before we can get fully immersed in the writer’s imaginative world, feeling our way into the nuances and recesses of possibility, caring enough about the characters or events or language or ideas (and preferably all four) to let the writer land us like netted fish. I think I’ve written before about the experience of reading yet one more chapter, still hoping and believing in the premise or the scenario or the author’s reputation. I’ve finished books that disappointed me, though my persistence was more like that of a teenaged boy fixed up on a date with the best girl in school, pretending until the evening clicks to a close that he isn’t really bored to tears, that things just haven’t gotten started yet.

But Madison Smartt Bell’s Doctor Sleep didn’t make me wait. I bought the book on reputation and topic. I loved Bell’s All Souls Rising, but got derailed by the follow-ups in his Haitian trilogy, never quite losing myself in the Caribbean madness that made the first book the deserved winner of an array of awards. Thus disappointed, I hadn’t really picked up Bell’s work since, though I vaguely felt I ought to. From the first sentences of the novel I was under the spell. The choice of words is purposeful, since the book is about a hypnotherapist who, while helping others solve all manner of problems and difficulties through his particular gift for putting them under, can neither solve his own problems nor put himself to sleep: he suffers from a crippling case of insomnia.

As in any good novel, the meanings are thickly layered. In some respects I found myself thinking of the Apostle Paul’s dictum that, wretched man that he was, he knew what he should do, and he wanted to do it, but he could not do the very thing he knew to do; indeed, the very thing he did not want to do was the very thing he did. The tale of all things human: the disjunction between knowledge and will, between thought and desire and act. The main character’s skills as a hypnotist are deeply related to his metaphysical wanderings amidst the mystics and heretics of the past, most particularly Giordano Bruno, burned at the stake because he claimed the church sought to promote good through force rather than through love. Ironically, the main character knows all this, knows in his head that love is the great unity of which the mystics speak, and yet turns away from love into abstraction, failing to love women because he cannot see them as human beings to whom he might be joined, seeing them instead as mystic abstractions through which he wants to escape the world.

In the end, accepting love means accepting death, which means accepting sleep–something that seems so natural to so many, but if you have suffered from insomnia, as I have, you realize that surrendering to sleep is a strange act of grace, one that cannot be willed but can only be received.

I think in some ways, too, there’s a lot of reflection in this book on the power of words and stories, their ability to put us under. So, perhaps inevitably, it is a book about writing and reading on some very deep level. Adrian, Doctor Sleep, takes people on a journey into their unconscious through words, and his patients surrender to him willingly. Indeed, Adrian believes, along with most hypnotists, that only those who want to be hypnotized actually can be. This is not so far from Coleridge’s notion of the willing suspension of disbelief. I do not believe Adrian’s metaphysical mumbo-jumbo, but for the space of the novel I believe it utterly. We readers want to be caught. We want to lose ourselves at least for that space and that time, so that reading becomes a little like the gift of sleep, a waking dream.

Under the spell of writing we allow Bell to take us into another world that is, surprisingly, like our own, one in which we see our own abstractedness, our own anxieties, our own petty crimes and misdemeanors, our own failures to love.

Dumpster Diving and other career moves: remembering the job market with Roger Whitson

It would be hard to say I enjoyed reading Roger Whitson’s very fine recent meditation in the Chronicle on the quest for a tenure-track job: his ambivalent feelings on finding one, the mixed exaltation and guilt at getting what so many of his peers would never find, and the experience of leaving behind an #altac existence where he had begun to make a home.

Hard to enjoy reading both because the story seems to typify what our academic life has actually become, and, frankly, because it reminded me too much of my own wandering years as a new academic a couple of decades ago.  I spent seven years full-time on the job market back in the day (if you count the last two years of graduate school).  I have estimated in the past that I must have applied for at least 700-800 jobs during those years–the idea of being targeted and selective a joke for a new father.  Fortunately, I was totally unemployed for only four months during those years, though that was enough to plunge me thousands of dollars into debt paying for health insurance.  For five of those seven years I had full-time work in various visiting assistant positions, and for two of those visiting years I was paid so little I qualified for food stamps, though I never applied for the program.  I taught as many extra courses as I could to pay the bills–probably foolish for my career, since publishing slowed to a crawl, but it saved my pride.  I remember asking, naively, during an interview for one such visiting position whether it was actually possible to live in that area of the country on what I was going to be paid.  The chair interviewing me hesitated, then responded, “Well, of course, your wife can work.”

Only one of those years did I fail to get an interview, and only two of those years did I fail to get a campus interview, but even then this seemed like a very peculiar and unhelpful measure of success for a beginning academic career.  We did not have anything called #altac in those days, and my plan B–which on my worst days I sometimes still wonder whether I should have followed–was to go back to cooking school and become a chef (I know, I know.  Another growth industry).  I never felt bad about pursuing a PhD in English, and I don’t think I would have even if I had gone on to become a chef.  The learning was worth it, to me at least.

But I did grow distant from college friends who became vice-presidents of companies or doctors in growing practices, all of whom talked about their mortgages and vacations in the Caribbean or Colorado, while I was living in the cheapest two-bedroom apartment in Fairfax, Virginia, that I could find and fishing furniture, including my daughter’s first bed, out of a dumpster.  (The furniture was held together, literally, by duct tape; I had conferences to pay for.)  And I spent a lot of evenings walking off my anxiety in the park next to our apartment complex, reminding myself of how much I had to be thankful for.  After all, I had a job and could pay my bills through the creative juggling of credit card balances.  A lot of my friends had found no jobs at all.  A low-rent comparison, I realize, but I would take what solace I could get.

I do not resent those days now, but that equanimity depends a lot on my having come out the other side.  The sobering thought in all of this is realizing that in the world of academics today I should count myself one of the lucky ones.  Reading Roger’s essay, and the many like it that have been published in the last twenty years, I always get a sick, hollow feeling in the gut, remembering what it was like to wonder what would happen if….

Reading Roger’s essay I was struck again by the fact that this is now the permanent condition of academic life in the humanities.  My own job story began more than 20 years ago at Duke, and even then we were told that the job market had been miserable for 15 years (but was sure to get better by and by).  Thirty-five years is not a temporary downturn or an academic recession.  It is a way of being.

MOOCs, all-online education, and for-profit universities are all responses to the economics of higher education, and they are unlikely to make things any better for the freshly minted PhD.  While there are some exciting innovations here with a lot of promise for extending learning to the many, they are attractive and draw interest because they promise to do it more cheaply, which in the world of higher education means teaching more students with fewer faculty hours.  Roger’s most powerful line came toward the end:  “Until we realize that we are all contingent, we are all #altac, we all need to be flexible, and we are all in this together, we won’t be able to effectively deal with the crisis in the humanities with anything other than guilt.”

This is right, it seems to me.  In a world that is changing as rapidly and as radically as higher education, we are all as contingent as the reporters and editors in the newsrooms of proud daily newspapers.  It is easy to say that the person who “made it” was talented enough or smart enough or savvy enough, but mostly they, I, we were just lucky enough to come out the other side.  And we would be misguided to imagine that because we made it into a world that at least resembled the world we imagined, that world will always be there.  We are an older institution and industry than music or radio or newspapers, but we are an industry and an institution nonetheless, and it seems to me that the change is upon us.  We are all contingent now.

The Best of Times, the Worst of Times: The U of Missouri Press is closing, Jennifer Egan is Tweeting

A bad week for publishing, but sometimes it seems like they all are.  First I was greeted with the news that the New Orleans Times-Picayune has cut back publication to three days a week.  Later the same day, three papers in Alabama announced a similar move to downsize and reduce circulation.  Apparently being an award-winning newspaper that does heroic community service in the midst of the disaster of a century is no longer enough.

Then today’s Twitter feed brought me news of another university press closing.

University of Missouri Press is closing after more than five decades of operation, UM System President Tim Wolfe announced this morning.

The press, which publishes about 30 books a year, will begin to be phased out in July, although a more specific timeline has not been determined.

Ten employees will be affected. Clair Willcox, editor in chief, declined to comment but did note that neither he nor any of the staff knew about the change before a midmorning meeting.

In a statement, Wolfe said even though the state kept funding to the university flat this year, administrators “take seriously our role to be good stewards of public funds, to use those funds to achieve our strategic priorities and re-evaluate those activities that are not central to our core mission.”

via University of Missouri Press is closing | The Columbia Daily Tribune – Columbia, Missouri.

Plenty has been said about the worrisome demise of daily papers and what the transformation of journalism into an online blogosphere really means for the body politic.  Will the Huffington Post, after all, actually cover anything in New Orleans if the paper goes under entirely?  Reposting is still not reporting, and having opinions at a distance is great fun but not exactly a form of knowledge.

The demise of or cutbacks to university presses is less bemoaned in the national press or blogosphere, but it is still worrisome.  Although I am now a believer in the possibilities of serious intellectual work occurring online, I am not yet convinced that the demise of the serious scholarly book with a small audience would be a good thing.  Indeed, I believe the best online work remains in a kind of symbiotic relationship with the traditional scholarly monograph or journal.  I keep my fingers crossed that this is merely an instance of creative destruction, and not destruction plain and simple.

On a more hopeful note, I will say that I thoroughly enjoyed the New Yorker’s tweeting of Jennifer Egan’s latest story, “Black Box,” and am looking forward to the next installments.  I’d encourage everyone to “listen in,” if that’s what you do on Twitter, but if you can’t, you can read it in a more traditional but still twitterish form at the New Yorker’s Page-Turner site.  To get the Twitter feed, go to @NYerFiction.  The reviews have been mixed, but I liked it a great deal.  Egan is a great writer, less full of herself than some others; she has a great deal to say, and she’s willing to experiment with new ways to say it.  Her last novel, A Visit from the Goon Squad, experimented with PowerPoint-like slides within the text.  And there’s a nice article over at Wired about the piece, suggesting it may be signaling a revival of serialized fiction.

Let’s hope so; it will make up for the loss of the U of Missouri Press, at least for today.

Blogging as textual meditation: Joyce Carol Oates and The Paris Review

One of the surprise pleasures afforded by Twitter has been following The Paris Review (@parisreview) and getting tweets linking me to their archives of author interviews.  I know I could just go to the website, but it feels like a daily act of grace to run across the latest in my Twitter feed, as if these writers are finding me in the ether rather than my searching for them dutifully.

(In my heart of hearts I am probably still a Calvinist; the serendipity of these lesser gods finding me is so much better than the tedious duty of seeking them out.)

This evening over dinner I read the latest, a 1978 interview with Joyce Carol Oates, a real gem.

INTERVIEWER

Do you find emotional stability is necessary in order to write? Or can you get to work whatever your state of mind? Is your mood reflected in what you write? How do you describe that perfect state in which you can write from early morning into the afternoon?

OATES

One must be pitiless about this matter of “mood.” In a sense, the writing will create the mood. If art is, as I believe it to be, a genuinely transcendental function—a means by which we rise out of limited, parochial states of mind—then it should not matter very much what states of mind or emotion we are in. Generally I’ve found this to be true: I have forced myself to begin writing when I’ve been utterly exhausted, when I’ve felt my soul as thin as a playing card, when nothing has seemed worth enduring for another five minutes . . . and somehow the activity of writing changes everything. Or appears to do so. Joyce said of the underlying structure of Ulysses—the Odyssean parallel and parody—that he really didn’t care whether it was plausible so long as it served as a bridge to get his “soldiers” across. Once they were across, what does it matter if the bridge collapses? One might say the same thing about the use of one’s self as a means for the writing to get written. Once the soldiers are across the stream . . .

via Paris Review – The Art of Fiction No. 72, Joyce Carol Oates.

Oates doesn’t blog, I think, and I wouldn’t dare to hold my daily textual gurgitations up next to Oates’s stupendous artistic outpouring.  On the other hand, I resonated with this, thinking about what writing does for me at the end of the day.  I’ve had colleagues ask me how I have the time to write every day, my sometimes longish diatribes about this or that subject that has caught my attention.  Secretly my answer is “How could I not?”

Ok, I know that for a long time this blog lay fallow, but I have repented of that and returned to my better self. Mostly (tonight is an exception), I do my blog late, after 10:00–late for someone over 50–like a devotion.  I just pick up something I’ve read that day, like Joyce Carol Oates, and do what English majors are trained to do:  find a connection.  Often I’m exhausted and cranky from the day–being an administrator is no piece of cake (but then, neither is being alive, so what do I have to complain about?).  Mostly I write as if I were talking to someone about the connections I saw, the problems they raised (or, more rarely, solved).

It doesn’t take that long–a half hour to an hour–and mostly I’ve given up television entirely.  I tell people I seem to think in paragraphs–sometimes very bad paragraphs, but paragraphs nevertheless–and years of piano lessons have left me a quick typist.  Sometimes I write to figure out what I think, sometimes to figure out whether what I think matters, sometimes to resolve a conundrum I have yet to figure out at work or at home, sometimes to make an impression (I am not above vanity).

But always I write because the day and my self disappear.  As Oates says above, the activity of writing changes everything, or at least appears to do so.  Among the everything that it changes is me.  I am most myself when I lose the day and myself in words.

Yesterday on Facebook I cited Paul Fussell saying, “If I didn’t have writing, I’d be running down the street hurling grenades in people’s faces.”

Well, though I work at a pacifist school and it is incorrect to say so, that seems about right.

Digital Humanities, “techno-lust”, and the Personal Economy of Professional Development: Ryan Cordell on simplification

It’s a sign of my unsimple and exceedingly overstuffed life that I’ve only now gotten around to reading Ryan Cordell’s ProfHacker piece from last week.  Ryan is moving to a new position at Northeastern (kudos!), and he’s taken the ritual of clearing out the clutter and junk of a household as a metaphor for the need to prioritize and simplify our professional practices–in his own case, to get rid of the techno-gadgets and gizmos that curiosity and past necessity have brought his way.

I have confessed before my appreciation for Henry David Thoreau—an odd thinker, perhaps, for a ProfHacker to esteem. Nevertheless, I think Thoreau can be a useful antidote to unbridled techno-lust. As I wrote in that earlier post, “I want to use gadgets and software that will help me do things I already wanted to do—but better, or more efficiently, or with more impact.” I don’t want to accumulate things for their own sake.

…

I relate this not to brag, but to start a conversation about necessity. We talk about tech all the time here at ProfHacker. We are, most of us at least, intrigued by gadgets and gizmos. But there comes a time to simplify: to hold a garage sale, sell used gadgets on Gazelle, or donate to Goodwill.

via Simplify, Simplify! – ProfHacker – The Chronicle of Higher Education.

Ryan’s post comes as I’ve been thinking a lot about the personal economy that we all must bring to the question of our working lives, our sense of personal balance, and our desire for professional development and fulfillment.  As I have been pushing hard for faculty in my school to get more engaged with issues surrounding digital pedagogy, and to consider working with students to develop projects in digital humanities, I have become keenly aware that the biggest challenge is not faculty resistance or a lack of faculty curiosity–though there is sometimes that.  The biggest challenge is the simple fact of a lack of faculty time.  At a small teaching college our lives are full, and not always in a good way.  There is extremely little bandwidth to imagine or think through new possibilities, much less experiment with them.

At our end-of-year school meeting I posed to the faculty the question of what they had found to be the biggest challenge of the past year, so that we could think through what to do about it in the future.  Amy, the faculty member who may be the most passionate about trying out the potential of technology in the classroom, responded, “No time to play.”  Amy indicated that she had bought ten new apps for her iPad that year but had not had any time to just sit around and experiment with them, to figure out everything they could do and imagine new possibilities for her classroom and the rest of her work.  Space to play is necessary for the imagination, for learning, and for change.

It is necessary for excellence, but it is easily the thing we value least in higher education.

When I discussed this same issue with my department chairs, one of them said that she didn’t really care how much extra money I could give them to do work on digital humanities and pedagogy; what she really needed was extra time.

This is, I think, a deep problem generally and a very deep problem at a teaching college with a heavy teaching load and restricted budgets.  (At the same time, I do recognize that some of the biggest innovations in digital pedagogy have come from community colleges with far higher teaching loads than ours.)  I think, frankly, that this is at the root of some of the slow pace of change in higher education generally.  Faculty are busy people, despite the stereotype of the professor with endless time to sit around mooning about nothing.  And books are… simple.  We know how to use them, they work pretty well, they are standardized in terms of their technical specifications, and we don’t have to reinvent the wheel every time we buy one.

Not so with the gadgets, gizmos, and applications that we accumulate rapidly with what Ryan describes as “techno-lust.”  (I have not yet been accused of having this, but I am sure someone will use it on me now.)  Unless driven by a personal passion, most faculty and administrators make an implicit and not irrational decision: “This is potentially interesting, but it would be just one more thing to do.”  The problem is exacerbated by the fact that changes in technology seem to speed up and diversify rather than slow down and focus.  Technology doesn’t seem to simplify our lives or make them easier, despite claims to greater efficiency.  Indeed, in the initial effort just to get familiar with it or figure out its possibilities, technology seems only to add to the clutter.

I do not know a good way around this problem: the need for play in the overstuffed and frantic educational world that so many of us inhabit.  One answer–just leave it alone and don’t push for innovation–doesn’t strike me as plausible in the least.  The world of higher education and learning is shifting rapidly under all of our feet, and the failure to take steps to address that change creatively will only confirm the stereotype of higher education as a dinosaur unable to respond to the educational needs of the public.

I’m working with the Provost to see if I can pilot a program that would give a course release to a faculty member to develop his or her abilities in technology in order to redesign a class or develop a project with a student.  But this is a very small drop in the midst of a very big bucket of need.  And given the frantic pace of perpetual change that seems to be characteristic of contemporary technology, it seems like the need for space to play, and the lack of it, is going to be a perpetual feature of our personal professional economies for a very long time to come.

Any good ideas?  How could I make space for professional play in the lives of faculty?  Or, for that matter, in my own?  How could faculty do it for themselves?  Is there a means of decluttering our professional lives to make genuine space for something new?

Why students of the Humanities should look for jobs in Silicon Valley

Ok, I’ll risk sounding like a broken record to say again that the notion that humanities students are ill-positioned for solid careers after college is simply misguided.  It still bears repeating.  This latest from Vivek Wadhwa at the Washington Post gives yet more confirmation that employers are looking not for specific majors but for skills and abilities and creativity–a package that can come with any major whatsoever, and that often comes with students in the humanities and social sciences.

Citing Damon Horowitz, who possesses degrees in both philosophy and engineering, and whose unofficial title at Google is In-House Philosopher (his official title is Director of Engineering), Wadhwa points out the deep need for humanities and social science students in the work of technology companies–a need that isn’t just special pleading from a humanist but is made vivid in the actual hiring practices of Silicon Valley companies.

Venture Capitalists often express disdain for startup CEOs who are not engineers. Silicon Valley parents send their kids to college expecting them to major in a science, technology, engineering or math (STEM) discipline. The theory goes as follows: STEM degree holders will get higher pay upon graduation and get a leg up in the career sprint.

The trouble is that theory is wrong. In 2008, my research team at Duke and Harvard surveyed 652 U.S.-born chief executive officers and heads of product engineering at 502 technology companies. We found that they tended to be highly educated: 92 percent held bachelor’s degrees, and 47 percent held higher degrees. But only 37 percent held degrees in engineering or computer technology, and just two percent held them in mathematics. The rest have degrees in fields as diverse as business, accounting, finance, healthcare, arts and the humanities.

Yes, gaining a degree made a big difference in the sales and employment of the company that a founder started. But the field that the degree was in was not a significant factor. ….

I’d take that a step further. I believe humanities majors make the best project managers, the best product managers, and, ultimately, the most visionary technology leaders. The reason is simple. Technologists and engineers focus on features and too often get wrapped up in elements that may be cool for geeks but are useless for most people. In contrast, humanities majors can more easily focus on people and how they interact with technology. A history major who has studied the Enlightenment or the rise and fall of the Roman Empire may be more likely to understand the human elements of technology and how ease of use and design can be the difference between an interesting historical footnote and a world-changing technology.

via Why Silicon Valley needs humanities PhDs – The Washington Post.

Again, at the risk of sounding like a broken record: this sounds like the kind of finding emphasized at the Rethinking Success Conference that I have now blogged on several times.  (I’ve heard theories that people come to be true believers if they hear a story 40 times.  So far I’ve only blogged on this 12 times, so I’ll keep going for a while longer.)  Although I still doubt it would be a good thing for a philosopher to go to Silicon Valley with no tech experience whatsoever, a philosopher who had prepared himself by acquiring some basic technical skills alongside his philosophy degree might be in a particularly good position indeed.  Worth considering.

Side note: the Post article points to a nice little bio of Damon Horowitz.  I suspect there are not many folks in Silicon Valley who can talk about the ethics of tech products in terms that invoke Kant and John Stuart Mill.  Maybe there should be more.

Digital Humanities as Culture Difference: Adeline Koh on Hacking and Yacking

My colleague Bernardo Michael in the History department here has been pressing me to understand that, properly understood, Digital Humanities should be deeply connected to our college-wide efforts to address questions of diversity and what the AAC&U calls inclusive excellence.  (Bernardo also serves as the special assistant to the President for diversity affairs.)  At first blush, I will admit, this seemed counter-intuitive to me, and I have struggled to articulate the priority between my interest in developing new efforts in Digital Humanities, which I tie to our college’s technology plan, and my simultaneous concern with furthering our institution’s diversity plan (beyond a general ethical interest, my primary field of study over the past 20 years has been multicultural American literature).

Nevertheless, I’ve started seeing more and more of Bernardo’s point as I’ve worked to get things started in Digital Humanities.  For one thing, the practices and personages of the digital world are talked about in cultural terms:  we use language like “digital natives” and “digital culture” and “netizens”–cultural terms that attempt to articulate new forms of social and cultural being.  In the practical terms of trying to create lift-off for some of these efforts, an administrator faces the negotiation of multiple institutional cultures and the challenging effort to get faculty–not unreasonably happy and proud about their achievements within their own cultural practices–to see that they actually need to become conversant in the languages and practices of an entirely different, digital culture.

Thus I increasingly see that Bernardo is right: just as we need to acclimate ourselves to other kinds of cultural difference in the classroom, and just as our teaching needs to begin to reflect the values of diversity and global engagement, so our teaching practices need to engage students as digital natives.  Using technology in the classroom or working collaboratively with students on digital projects isn’t simply instrumental–that is, it isn’t simply about getting students familiar with things they will need for a job.  It is, in many ways, about cultural engagement, respect, and awareness.  How must our own cultures within academe adjust and change to engage with a new, and increasingly not so new, culture–one that is increasingly central, even dominant, in all of our cultural practices?

Adeline Koh, over at Richard Stockton College (and this fall at Duke, I think), has a sharp post on these kinds of issues, focusing more on the divide between theory and practice–yacking and hacking–in Digital Humanities.  Adeline has more theory hope than I do, but I like what she’s probing in her piece, and I especially like where she ends up:

If computation is, as Cathy N. Davidson (@cathyndavidson) and Dan Rowinski have been arguing, the fourth “R” of 21st century literacy, we very much need to treat it the way we already do existing human languages: as modes of knowledge which unwittingly create cultural valences and systems of privilege and oppression. Frantz Fanon wrote in Black Skin, White Masks: “To speak a language is to take on a world, a civilization.”  As Digital Humanists, we have the responsibility to interrogate and to understand what kind of world, and what kind of civilization, our computational languages and forms create for us. Critical Code Studies is an important step in this direction. But it’s important not to stop there, but to continue to try to expand upon how computation and the digital humanities are underwritten by theoretical suppositions which still remain invisible.

via More Hack, Less Yack?: Modularity, Theory and Habitus in the Digital Humanities | Adeline Koh.
http://www.adelinekoh.org/blog/2012/05/21/more-hack-less-yack-modularity-theory-and-habitus-in-the-digital-humanities/

I suspect that Bernardo and Adeline would have a lot to say to each other.