On being human: Lessons from Harvard Institute for Management and Leadership in Higher Education

In a specific sense I am an unabashed advocate of what has come to be called the applied humanities: roughly and broadly speaking, the effort to connect the study of the humanities to identifiable and practical social goods.  For me it also includes the effort to develop humanities programs that take seriously our responsibility (at least in significant part) for preparing students for productive lives after college, preparation that I think really should be embedded within humanities curricula, advising, cocurricular programming, and the general ethos and rhetoric we use to inculcate in our students what it means to be a humanist.

In several respects this conviction lies at the root of my advocacy both for digital humanities programs and for career planning and programming for liberal arts students, as different as these two areas seem to be on the surface.  I have little room left anymore for the idea that “real learning” or intellectual work pulls up its skirts to avoid the taint of the marketplace or the hurly-burly of political arenas, or that we demonstrate the transcendent value of what we do over and above professional programs by repeatedly demonstrating our irrelevance.  Far from diminishing the humanities, an insistence that what we do has direct and indirect, obvious and not so obvious connections to social value enhances the humanities.  It’s not just a selling point to a doubting public.  As I said yesterday, the only good idea is the idea that can be implemented.  We ought to be proud of the fact that we can show directly how our students succeed in life, how they apply the things they’ve learned, how they find practical ways of making meaningful connections between their academic study and the world of work.

At the same time, I will admit that some versions of this argument leave me cold.  They risk saying that the only thing really valuable about the humanities is what is practically relevant to the marketplace. I greet the effort to make Wordsworth a useful version of a management seminar with a queasy stomach.

It may sound like a nice day out in beautiful surroundings, but can walking around Lake District sites synonymous with Romantic poet William Wordsworth really offer business leaders and local entrepreneurs the crucial insights they need?

That is precisely the claim of Wordsworth expert Simon Bainbridge, professor of Romantic studies at Lancaster University, who believes the writer can be viewed as a “management guru” for the 21st century.

Since 2007, the scholar has taken students down into caves and out on canoes to the island on Grasmere once visited by Wordsworth and fellow poet Samuel Taylor Coleridge, and to places where many of the former’s greatest works were written, for what he called “practical exercises linked to the themes of Wordsworth’s poetry.”

Such walks, which also have been incorporated into development days for individual firms, are now being offered as a stand-alone option for local and social entrepreneurs at a rate of £175 ($274) a day.

Read more: http://www.insidehighered.com/news/2012/08/09/businesses-pay-british-professor-teach-them-about-wordsworth#ixzz236bQaECf 
Inside Higher Ed 

I do not find the insight here wrong so much as sad.  If the only reason we can get people to read Wordsworth is because he will enhance their management skills, we have somehow misplaced a priority, and misunderstood the role that being a manager ought to play in our lives and in the social and economic life of our society.  It is the apparent reduction of all things and all meaning to the marketplace that is to be objected to and which every educational institution worthy of the name ought to furiously resist, not the fact of marketplaces themselves.

I was lucky enough this summer to attend the Harvard Institute for Management and Leadership in Education.  To be honest, I went thinking I was going to get all kinds of advice on things like how to organize projects, how to manage budgets, how to promote programs, how to supervise personnel.  There was some of that to be sure, but what struck me most was that the Institute, under the leadership of Bob Kegan, put a high, even principal, priority on the notion that managers have to first take care of who they are as human beings if they are to be the best people they can be for their colleagues and their institutions.  You have to know your own faults and weakness, your own strengths, your dreams, and you have to have the imagination and strength of mind and heart (and body) to learn to attend to the gifts, and faults and dreams and nightmares of others before or at least simultaneously with your own.  In other words, being a better manager is first and foremost about becoming a healthier, more humane, fuller human being.

The tendency of some applied humanities programs to show the relevance of poetry by showing that it has insights into management techniques, or the relevance of philosophy because it will help you write a better project proposal, misplaces causes and turns the human work of another imagination (in this case Wordsworth’s) into an instrumental opportunity.  The reason for reading Wordsworth, first and foremost, is because Wordsworth is worth reading, and simultaneously because the encounter with Wordsworth will give you the opportunity to be a fuller, more imaginative, more thoughtful human being than you were before.

If you become that, you will have a chance to be a better manager.  But even if you don’t become a better manager, or if you lose your job because your company is overtaken by Bain Capital or because students no longer choose to afford your pricey education, you will, nonetheless, be richer.

Hermeneutics of the stack and the list: unreading journals in print or online

The New York Times reported yesterday that The Wilson Quarterly will put out its final print issue in July (Wilson Quarterly to End Print Publication – NYTimes.com). The editorial staff seemed sanguine.

“We’re not going on the Web per se,” Steven Lagerfeld, the magazine’s editor, said in an interview. “We already have a Web site. The magazine will simply be published in a somewhat different form as an app,” first in the iTunes store and later on the Android platform.

And, to be honest, I’m sanguine too.  Although I noted the demise of the University of Missouri Press with a half shudder last week, I have to admit that I don’t greet the demise of print journals with the same anxiety.  I’ve recognized lately that I mostly buy paper journals so I can have access to their online manifestations, or because I feel guilty knowing that online long-form journalism and feature writing has yet to find a way to monetize itself effectively. I try to do my part by littering my office and bedroom with stacks and stacks of largely unopened New York Reviews, New Yorkers, Chronicles of Higher Ed, and a few other lesser-known magazines and specialist journals. But most of my non-book reading, long form or not, is done on my iPad.

I will leave aside the question of what we will do if good journals like the Wilson Quarterly really can’t survive on iTunes distribution (WQ only survived in paper because of the indulgence of the Woodrow Wilson International Center for Scholars). I’m more interested at the moment in the fact of the stack and what it signifies in the intellectual life.  Every intellectual I know is guilty of stockpiling books and journals that she never reads, and can never reasonably expect to, at least not if she has a day job.  The stack is not simply a repository of knowledge and intellectual stimulation beckoning to the reader, drawing her away from other mundane tasks like reading for or preparing for class with the ennobling idea of staying informed. (Side note: academia is the one place in life where every activity of daily life can be construed as tax deductible; just make a note about it and write “possible idea for future article” at the top of the page.)

No, the stack is also a signifier.  It exists not so much to be read, since most academics give up hopelessly on the idea of reading every word of the journals they receive.  The stack exists to be observed.  Observed, on the one hand, by the academic him or herself, a reassuring sign of one’s own seriousness, that one reads such things and is conversant with the big ideas, or at least the nifty hot ideas, about culture high and low.  The stack also exists to be observed by others:  the rare student who comes by during office hours, the dean who happens to drop by to say hello, the colleagues coming in to ask you out for coffee–“Oh, you already got the latest issue of PMLA!” The stack suggests you are up to date, or intend to be.  The stack communicates your values.  Which journal do you put strategically out at the edge of the desk to be observed by others, and which do you stack heedlessly on top of the file cabinet?  Even the hopelessly disheveled office can signify, as did Derrida’s constantly disheveled hair: I am too busy and thinking too many big thoughts to be concerned with neatness.

The stack, like the Wilson Quarterly, is on its way out, at least for academics.  I realized four or five years ago that e-books would signify the end of a certain form of identification since people would no longer self-consciously display their reading matter in coffee houses or on subways, every text hidden in the anonymous and private cover of the Kindle or now the iPad.  While I could connect now with other readers in Tibet or Siberia, I could not say off-handedly to the college student sitting next to me–“Oh, you’re reading Jonathan Safran Foer, I loved that book!”

The stack too is going and will soon be gone.  Replaced now by the endless and endlessly growing list of articles on Instapaper that I pretend I will get back to.  This has not yet had the effect of neatening my office, but it will remove one more chance at self-display.  I will soon be accountable only for what I know and what I can actually talk about, not what I can intimate by the stacks of unread paper sitting on my desk.

Digital Archive as Advertisement: The Hemingway Papers

The pace at which digital material is being made available to the public and to students and scholars in the humanities is accelerating, whether one thinks of the digitization of books, the new MOOCs from MIT and Harvard and others that will extend learning in the humanities and other fields, or the digitization of papers and manuscripts that were previously held in highly restricted manuscript or rare book sections of single libraries, like the James Joyce Papers just released in Ireland.

Another addition to this list is the release of a new digitized collection of Hemingway’s writings for the Toronto Star.  The Star has put together the columns Hemingway wrote for the paper in the early 1920s, along with some stories about the writer.  I’m basically extremely happy that archives like this and others are taking their place in the public eye.  I had a great course on Hemingway while pursuing an MFA at the University of Montana with Gerry Brenner, and the legacy of Hemingway was felt everywhere.  Still is as far as I’m concerned.

At the same time, I admit that the Star site left me just a little queasy and raised a number of questions about the relationship between a commercial enterprise like the Star and digital and scholarly work more generally.  The first cue for me was the statement of purpose in the subtitle of the homepage:

The legendary writer’s reporting from the Toronto Star archives, featuring historical annotations by William McGeary, a former editor who researched Hemingway’s columns extensively for the newspaper, along with new insight and analysis from the Star’s team of Hemingway experts.

I hadn’t really realized that the Toronto Star was a center of Hemingway scholarship, but maybe I’ve missed something over the past 20 years.  Other similar statements emphasize the Star’s role in Hemingway’s life as much as anything about Hemingway himself:  emphases on the Star’s contributions to the great writer’s style (something that, if I remember, Hemingway himself connected more to his time in Kansas City), emphases on the way the Star nurtured the writer and on the jovial times Hemingway had with Star editorial and news staff.  Sounds a little more like a family album than a really serious scholarly take on what Hemingway was about in this period.  Indeed, there is even a straightforward and direct advertisement on the page as it sends you to the Toronto Star store where you can purchase newsprint editions of Hemingway’s columns.

I don’t really want to look a gift horse in the mouth.  There’s a lot of good stuff here, and just having the articles and columns available may be enough; I can ignore the rest.  Nevertheless, the web is a framing device that makes material available within a particular context, and here that context clearly has a distinct commercial angle.  It strikes me that this is a version of public literary history that has all the problems of public history in general that my colleague John Fea talks about over at The Way of Improvement Leads Home.  Here, of course, it is not even really the public doing the literary history but a commercial enterprise that has a financial stake in making itself look good in light of Hemingway’s legacy.

The Star promises the site will grow, which is a good thing.  I hope it will grow in a way that allows for more genuine scholarly engagement with Hemingway’s legacy as well as more potential interactivity.  The site is static, with no opportunity for engagement at all, so everything is controlled by the Star and its team of Hemingway experts.  We take it or we leave it.

For the moment I am taking it, but I worry about the ways commercial enterprises can potentially shape our understanding of literary and cultural history for their own ends.  I wonder what others think about the role of commercial enterprises in establishing the context through which we think about literature and culture?

Barack Obama’s Waste Land: President as First Reader

GalleyCat reported today that the new biography of Barack Obama gives an extensive picture of Obama’s literary interests, including a long excerpt from a letter in which Obama details his engagement with T.S. Eliot and his signature poem, The Waste Land. Obama’s analysis:

Eliot contains the same ecstatic vision which runs from Münzer to Yeats. However, he retains a grounding in the social reality/order of his time. Facing what he perceives as a choice between ecstatic chaos and lifeless mechanistic order, he accedes to maintaining a separation of asexual purity and brutal sexual reality. And he wears a stoical face before this. Read his essay on Tradition and the Individual Talent, as well as Four Quartets, when he’s less concerned with depicting moribund Europe, to catch a sense of what I speak. Remember how I said there’s a certain kind of conservatism which I respect more than bourgeois liberalism—Eliot is of this type. Of course, the dichotomy he maintains is reactionary, but it’s due to a deep fatalism, not ignorance. (Counter him with Yeats or Pound, who, arising from the same milieu, opted to support Hitler and Mussolini.) And this fatalism is born out of the relation between fertility and death, which I touched on in my last letter—life feeds on itself. A fatalism I share with the western tradition at times.

A Portrait of Barack Obama as a Literary Young Man – GalleyCat.

For a 22-year-old, you’d have to say this is pretty good. I’m impressed with the nuance of Obama’s empathetic imagination, both in his ability to perceive the differences between the three great conservative poets of that age and in his ability to identify with Eliot against his own political instincts. This is the kind of reading we’d like to inculcate in our students, and I think it lends credence to the notion that a mind trained in this kind of engagement might be better prepared for civic engagement than one that is not. But too often even literature profs are primarily readers of the camp, so to speak, lumping those not of their own political or cultural persuasion into the faceless, and largely unread, camp of the enemy, and appreciating without distinction those who further our pet or current causes.

This is too bad, reducing a richer sense of education for civic engagement to the narrower and counterproductive sense of reading as indoctrination. I think the older notion was the vision of education that motivated the founding fathers. Whatever one thinks of his politics, passages like this suggest to me that Obama could sit unembarrassed with Jefferson and Adams, discussing in all seriousness the relationship between poetry and public life. It would be a good thing to expect this of our presidents, rather than stumbling upon it by accident.

We are all twitterers now: revisiting John McWhorter on Tweeting

Angus Grieve Smith over at the MLA group on LinkedIn pointed me toward John McWhorter’s take on Twitter from a couple of years back. [I admit to some embarrassment in referencing an article that’s TWO YEARS OLD!! But the presentism of writing for the web is a story for another time.] McWhorter’s basic case is that Twitter is not really writing at all, but a form of graphic speech (my term, not McWhorter’s). He points out that most people, even after learning how to read and write, speak in bursts of 7 to 10 words, and that writing at its origins reflected these kinds of patterns. In other words, as speakers we are all twitterers. Of Twitter, McWhorter says:

The only other problem we might see in something both democratic and useful is that it will exterminate actual writing. However, there are no signs of this thus far. In 2009, the National Assessment of Education Performance found a third of American eighth graders – the ones texting madly right now — reading at or above basic proficiency, but crucially, this figure has changed little since 1992, except to rise somewhat. Just as humans can function in multiple languages, they can also function in multiple kinds of language. An analogy would be the vibrant foodie culture that has thrived and even flowered despite the spread of fast food.

Who among us really fears that future editions of this newspaper will be written in emoticons? Rather, a space has reopened in public language for the chunky, transparent, WYSIWYG quality of the earliest forms of writing, produced by people who had not yet internalized a sense that the flavor of casual language was best held back from the printed page.

This speech on paper is vibrant, creative and “real” in exactly the way that we celebrate in popular forms of music, art, dance and dress style. Few among us yearn for a world in which the only music is classical, the only dance is ballet and daily clothing requires corsets and waistcoats. As such, we might all embrace a brave new world where we can both write and talk with our fingers.

via Talking With Your Fingers – NYTimes.com.

I mostly agree with this idea that we are all code shifters. I’m less sanguine than McWhorter, however. His eighth-grade sample takes students before they’ve entered a period of schooling where they’d be expected to take on more serious reading and longer and more complex forms of writing. Twitter may not be to blame, but it’s not clear to me that the state of writing and reading comprehension at higher levels is doing all that well; there’s some pretty good evidence that it isn’t. So just as a foodie culture may thrive in the midst of a lot of fast food, it’s not clear that we ought to be complacent in the face of an obesity epidemic. In the same way, just because tweeting may not signal the demise of fine writing, it’s not clear that it’s helping the average writer become more fluent and sophisticated in language use.

Is Twitter Destroying the English language?

Coming out of the NITLE seminar on Undergraduate Research in Digital Humanities, my title question was one of the more interesting questions on my mind.  Janis Chinn, a student at the University of Pittsburgh, posed this question as a motivation for her research on shifts in linguistic register on Twitter.  I’m a recent convert to Twitter and see it as an interesting communication tool, but also an information network aggregator.  I don’t really worry about whether Twitter is eroding my ability to write traditional academic prose, but then, I’ve inhabited that prose for so long it’s more the case that I can’t easily adapt to the more restrictive conventions of Twitter.  And while I do think students are putting twitterisms in their papers, I don’t take this as specifically different from the tendency of students to use speech patterns as the basis for constructing their papers without recognizing the different conventions of academic prose.  So Twitter poses some interesting issues, but not issues that strike me as different in kind from other kinds of language use.

I gather from the website for her project that Janis is only at the beginning of her research and hasn’t developed her findings yet, but it looks like a fascinating study.  Part of her description of the work is as follows:

Speakers shift linguistic register all the time without conscious thought. One register is used to talk to professors, another for friends, another for close family, another for one’s grandparents. Linguistic register is the variety of language a speaker uses in a given situation. For example, one would not use the same kind of language to talk to one’s grandmother as to your friends. One avoids the use of slang and vulgar language in an academic setting, and the language used in a formal presentation is not the language used in conversation. This is not just a phenomenon in English, of course; in languages like Japanese there are special verbs only used in honorific or humble situations and different structures which can increase or decrease the politeness of a sentence to suit any situation. This sort of shift takes place effortlessly most of the time, but relatively new forms of communication such as Twitter and other social media sites may be blocking this process somehow.

In response to informal claims that the current generation’s language is negatively affected by modern communication tools like Twitter, Mark Liberman undertook a brief analysis comparing the inaugural addresses of various Presidents. This analysis can be found on the University of Pennsylvania’s popular linguistics blog “Language Log”. Remarkably, he found a significant trend of shortening sentence and word lengths over the last 200 years. My research, while not addressing this directly, will demonstrate whether using these services affects a user’s ability to shift linguistic registers to match the situation as they would normally be expected to.

Fascinating question in and of itself. I think on some level I’ve always been deeply aware of these kinds of shifts.  As a kid, when my parents were missionaries in New Guinea, I would speak with an Aussie accent while I was with kids at the school across the valley, then shift back into my Okie brogue on the mission field and in my house.  And as I made my way into academe, my southernisms and southwesternisms gradually dropped away, with very few exceptions–aware as I was that my accent somehow did not signal intelligence and accomplishment.  Mockery of southern white speech remains a last bastion of prejudice in the academy generally.  I don’t think these are the kinds of register shifts Janis is looking at, but it’s the same territory.

I’m also more interested in the general motive questions.  If we could prove that Twitter inhibited the ability to shift registers, would that count as destroying or damaging the language in some sense?  If we could demonstrate that Twitter was leading people to use shorter and shorter sentences–or to be less and less able to comprehend sentences longer than 140 characters–would this signal an erosion of the language?  We must have some notion that language can be used in more effective and less effective ways, since we are all very aware that communication can fail abysmally or succeed beyond our hopes, and usually ends up somewhere in between.  Does the restricted nature of Twitter limit or disable some forms of effective communication, while simultaneously enabling others?  These are interesting questions.  I’m sure more intelligent people than I am are working on them.

Do Humanities Programs Encourage the Computational Illiteracy of Their Students?

I think the knee-jerk and obvious answer to my question is “No.”  If humanities profs were confronted with the question of whether their students should develop their abilities in math (or, more broadly, in math, science, and technology), many or most would say yes.  On the other hand, I read the following post from Robert Talbert at the Chronicle of Higher Ed, and it got me thinking just a bit about how and whether we in the humanities contribute to an anti-math attitude among our own students, if not in the culture as a whole.

I’ve posted here before about mathematics’ cultural problem, but it’s really not enough even to say “it’s the culture”, because kids do not belong to a single monolithic “culture”. They are the product of many different cultures. There’s their family culture, which as Shaughnessy suggests either values math or doesn’t. There’s the popular culture, whose devaluing of education in general and mathematics in particular ought to be apparent to anybody not currently frozen in an iceberg. (The efforts of MIT, DimensionU, and others have a steep uphill battle on their hands.)

And of course there’s the school culture, which itself a product of cultures that are out of kids’ direct control. Sadly, the school culture may be the toughest one to change, despite our efforts at reform. As the article says, when mathematics is reduced to endless drill-and-practice, you can’t expect a wide variety of students — particularly some of the most at-risk learners — to really be engaged with it for long. I think Khan Academy is trying to make drill-and-practice engaging with its backchannel of badges and so forth, but you can only apply so much makeup to an inherently tedious task before learners see through it and ask for something more.

via Can Math Be Made Fun? – Casting Out Nines – The Chronicle of Higher Education.

This all rings pretty true to me.  There are similar versions of this in other disciplines.  In English, for instance, students unfortunately can easily learn to hate reading and writing through what they imbibe from popular culture or through what they experience in the school system.  For every hopeless math geek on television, there’s a reading geek to match.  Still and all, I wonder whether we in the humanities combat and intervene in the popular reputation of mathematics and technological expertise, or whether we just accept it, and in fact reinforce it.

I think, for instance, of the unconscious assumption that there are “math people” and “English people”;  that is, there’s a pretty firmly rooted notion that people are born with certain proclivities and abilities, and that there is no point in addressing deficiencies in your literacy in other areas.  More broadly, I think we apply this to students, laughing in knowing agreement when they talk about coming to our humanities disciplines because they just weren’t math people or science people, or groaning together in the faculty lounge about how difficult it is to teach our general education courses to nursing students or to math students.  As if our own abilities were genetic.

In high school I was highly competent in both math and English, and this wasn’t all that unusual for students in the honors programs.  On the other hand, I tested out of math and never took another course in college, and none of my good humanistic teachers in college ever challenged me or asked me to question that decision.  I was encouraged to take more and different humanities courses (though, to be frank, my English teachers were suspicious of my interest in philosophy), but being “well-rounded” and “liberally educated” seems in retrospect to have been largely a matter of being well-rounded in only half of the liberal arts curriculum.  Science and math people were well-rounded in a different way, if they were well-rounded at all.

There’s a lot of reason to question this, not least of which being that if our interests and abilities are genetic, we have seen a massive surge of the gene pool toward the STEM side of the equation, if enrollments in humanities majors are any judge.  I think it was Malcolm Gladwell who recently pointed out that genius has a lot less to do with giftedness than it does with practice and motivation.  Put 10,000 hours into almost anything and you will become a genius at it (not entirely true, but the general principle applies).  Extrapolating, we might say that even if students aren’t going to be geniuses in math and technology, they could actually get a lot better at it if they’d only try.

And there’s a lot of reason to ask them to try.  At the recent Rethinking Success conference at Wake Forest, one of the speakers, who researches the transition of college students into the workplace, pounded the table and declared, “In this job market you must either be a technical student with a liberal arts education or a liberal arts major with technical savvy.  There is no middle ground.”  There is no middle ground.  What became quite clear to me at this conference is that companies mean it when they say they want students with a liberal arts background.  However, it was also very clear to me that they expect those students to have technical expertise that can be applied immediately to job performance. Speaker after speaker affirmed the value of the liberal arts.  They also emphasized the absolute and crying need for computational, mathematical, and scientific literacy.

In other words, we in the humanities will serve our students extremely poorly if we accept their naive statements about their own genetic makeup, allowing them to proceed with a mathematical or scientific illiteracy that we would cry out against if the same level of illiteracy were evident in others with respect to our own disciplines.

I’ve found, incidentally, in my conversations with my colleagues in information sciences or math or the sciences, that many of them are much more conversant in the arts and humanities than I or my colleagues are in even the generalities of science, mathematics, or technology.  This ought not to be the case, and in view of that, a few of my colleagues and I are considering taking some workshops in computer coding with our information sciences faculty.  We ought to work toward creating a generation of humanists that does not perpetuate our own levels of illiteracy, for their own sake and for the health of our disciplines in the future.