Can we create a humanities for the 21st century?: Reflections on Cathy Davidson

I’ve been invited to serve on a panel at the Lilly Fellows Administrators Workshop this fall in New Orleans, so I’ll use the event as an excuse to revive this blog–famous last words–by reflecting on some reading I’m doing in preparation. Broadly speaking, I’ve been asked to talk about how the humanities can connect to career preparation, since we’ve done a lot of work in this area at Messiah College; the conference focuses on connecting mission and post-baccalaureate success.

Sometimes I admit that I think these kinds of discussions end up being far too narrowly cast for my taste; humanists concede that we must do something to address our current and never-ending crisis or crises, and so we talk about career preparation as if it is a concession, something that we will do if we have to, as long as we can keep doing the idealistic things that we have always done. Or else something that we will do for the moment even as we look nostalgically to the past or longingly toward a future in which the economy is better, our budgets are sound, and our classrooms are burgeoning. On this view, humanities faculty engaging with career preparation is a necessary evil or a pragmatic necessity, but it never really gets to the root of, or affects, a fundamental understanding of what the humanities are about. As an administrator, I admit that I have become pretty pragmatic and willing to put up with more than my share of necessary evils. Nevertheless, I confess that I find this view of engagement with student careers seriously wanting.

I think that the halcyon days of yore are not returning, and even if they did it might not actually be all that great a thing. Rather, what I want to believe we are about at Messiah College–when we revise curricula to include more attention to career concerns, when we train faculty advisors to address vocational issues, when we work to connect internships, service learning, and other forms of experiential learning directly to our liberal arts course work, or when we begin new projects in the digital humanities–is creating a humanities for the 21st century.

In this, I find myself resonating sympathetically when I read Cathy Davidson or hear her speak, as I did last year at the CIC conference in Pittsburgh. Davidson’s ruling metaphor, it seems to me, is that our current forms of education, even humanities education, are appropriate to an industrial era, but that we have yet to develop an education appropriate to our own era.

I read again this afternoon her essay on these issues from Academe a few years back, “Strangers on a Train.” A passage that particularly stood out:

If you look at the curriculum in most humanities departments, you would barely notice that there is a crisis and there has been one for decades. At most colleges and universities, humanities departments continue to have a hierarchy of requirements and teaching assignments that imply that the department’s chief mission is to train students for professional careers in the humanities. Most humanities departments do not seem designed to prepare students for any and all careers, including in the sciences, even though all careers require reading, writing, critical thinking, theoretical analysis, historical perspective, and cross-cultural knowledge.

Davidson rightly points out that one consequence of mass education as we have come to know it is that liberal arts programs have tended to become pre-professional in their orientation, but in a deleterious sense. That is, we think mostly that we are preparing future graduate students in the humanities, or we organize our curricula as if we are doing that. Davidson’s essay is a clarion call, if a somewhat unspecific one, to get beyond this form of the humanities toward a broader-based approach to the vocational needs of contemporary students. Ironically, it seems to me, this might ultimately make our humanities programs more genuinely liberal arts programs, designed broadly rather than for discipline-specific expertise.

The one issue Davidson doesn’t address here is the one that I think leaves humanities programs resistant to change along the lines she seems to be envisioning. That is, so long as we argue that humanities programs are the best preparation for a flexible career in the future, that we give students superior skills in communication and analysis, and that statistics show our students do relatively well in the job market overall, it remains unclear why the pre-graduate-school model needs to change. I have heard this argument stated eloquently: “Yes, we prepare you so you can go to graduate school; but if you don’t, you’ve been prepared for everything else as well because of all the great communication skills we’ve given you.”

I don’t actually agree with this argument, but it is a genuine argument. Where it falls short, I think, is in an overconfidence that our students know how to translate knowledge between fields of practice. This is, I think, a false assumption. Conversations with business and career development professionals over the past four or five years have convinced me that humanities students regularly struggle to articulate the relationship between what they have done with their education and the needs of employers. As I have put it in the past, we broaden our students’ horizons admirably, but we resist teaching them how to walk into those horizons, or don’t even think to do so. Indeed, in the worst case, where professors or departments give students only non-instrumental arguments for their fields–“this is inherently worth studying”–we implicitly teach students that they positively should not make connections between their academic fields and some other pathway or endeavor. Students then not only receive no practice in applying their knowledge, and not only are left inarticulate about other career directions, but can come to feel unconsciously that it is inappropriate for them to make such connections at all. I have had students–STUDENTS!–say to me, “I know we aren’t supposed to worry about whether humanities major X connects to a job, but….”

Fortunately, this kind of statement is becoming increasingly rare at Messiah College. Whatever a humanities program for the 21st century should look like, its outcomes ought to include that students have had practice applying their program of study to non-academic work environments, and that students can effectively and unashamedly articulate the value of their program of study in the humanities to future employers.

A podcast of Cathy Davidson’s talk to the CIC, “Educating Students for Their Future, Not Our Past,” is available here.

The slide show from her presentation is available here.


Dreaming of Heaven? Connection and Disconnection in Cathy Davidson’s Commencement Address at UNC

Cathy Davidson and I crossed paths very briefly at Duke what now seems ages ago, she one of the second wave of important hires during Duke’s heyday in the late 80s and 90s, me a graduate student nearly finished and regretting that I never had the chance to study with someone the graduate student scuttlebutt told me was a great professor. I was sorry to miss the connection. And it’s one of the ironies of our current information age that I am more “connected” to her now in some respects than I ever was during the single year we were in physical proximity at Duke: following her tweets, following her blog at HASTAC, checking in on this or that review or interview as it pops up on my Twitter feed or in this or that electronic medium.

I’m sure, of course, that she has no idea who I am.

In the past several years, of course, Davidson has become one of the great intellectual cheerleaders for the ways our current digital immersion is changing us as human beings, much for the better in Davidson’s understanding. Recently Davidson gave the commencement address at the UNC School of Information and Library Science and emphasized the ways in which our information age is changing even our understanding of post-collegiate adulthood by enabling, or seeming to enable, the possibility of permanent connection.

How do you become an adult?   My students and I spent our last class together talking about the many issues at the heart of this complex, unanswerable question, the one none of us ever stops asking.  One young woman in my class noted that, while being a student meant being constantly together—in dorms, at parties, in class—life on the other side of graduation seemed dauntingly “individual.”  Someone else piped up that at least that problem could be solved with a list serv or a Facebook page.  From the occasional email I receive from one or another of them, I know the students in that class came up with a way to still stay in touch with one another. 

 In the fourth great Information Age,  distance doesn’t have to mean loss in the same way it once did.  If Modernity—the third Industrial Age of Information—was characterized by alienation, how can we use the conditions of our connected Information Age to lessen human alienation, disruption of community, separation, loss?  I’m talking about the deep  “social life of information,” as John Seely Brown would say, not just its technological affordances.  How can we make sure that we use the communication technologies of our Age to help one another, even as our lives take us to different destinations?  How can we make sure our social networks are also our human and humane safety net?  

via Connection in the Age of Information: Commencement Address, School of Information and Library Science, UNC | HASTAC.

At the end of her address Davidson asked the graduates from UNC–ILS to stand and address one another:

And now make your colleague a promise. The words are simple, but powerful, and I know you won’t forget them:  Please say to one another, “I promise we will stay connected.” 

There’s something powerful and affecting about this, but I’ll admit that it gave me some pause, both because I think it is a promise that is fundamentally impossible to keep, even amidst the powers of our social networks, and because I’m not sure keeping it faithfully would be an unqualifiedly positive thing.

The dream of permanent and universal connection, of course, is a dream of heaven, an infinite and unending reconciliation whereby the living and the dead speak one to another in love without ceasing. But there are many reasons why this remains a dream of heaven rather than a fact of life, not least being our finite capacity for connection. According to some cognitive theorists, human beings have the capacity for maintaining stable relationships with at most about 200 to 250 people, with many putting the number much lower. I am not a cognitive scientist, so I won’t argue for the accuracy of a number, and I can’t really remember at the moment whether Davidson addresses this idea in her recent work, but to me the general principle seems convincing. While the internet might offer the allure of infinite connection, and while we might always be able to add more computing power to our servers, and while the human brain is no doubt not yet tapped out in its capacities, it remains the case that we are finite, limited, and…human. This means that while I value the 600 friends I have on Facebook, the much smaller congregation that visits my blog, those who follow me or whom I follow on Twitter, and the number with whom I have old-fashioned and boring face-to-face relationships in the flesh, I am meaningfully and continuously connected to only a very few of them compared to the number of connections I have in the abstract. This leads to the well-known phenomenon of the joyous and thrilling reconnection with high school friends on Facebook, followed by long fallow periods punctuated only by the thumbs-up “like” button for the occasional post about new grandchildren. We are connected, but we are mostly still disconnected.

And, I would say, a good thing too.

That is, it seems to me that there can be significant value in becoming disconnected, whether intentionally or not. For one thing, disconnection gives space for the experience of the different and unfamiliar. One concern we’ve had in our study abroad programs is that students will sometimes stay so connected to the folks back home–i.e., their online comfort zone–that they will not fully immerse themselves in or connect with the cultures they are visiting. In other words, they miss an opportunity for new growth and engagement with difference because they are unwilling to let go of the connections they already have and are working, sometimes feverishly, to maintain.

Stretched through time, we might say that something very similar occurs if maintaining connections with the communities, and the relational selves, of our past becomes so imperative that we cannot engage with the relational possibilities of our present. In order to be fully present to those connections that are actually significant to me–even those relationships that are maintained primarily online–I have to let hordes and hordes of relationships die or lie fallow, maintained only through the fiction of connection that my Facebook and Twitter news feeds happen to allow.

Of course, I don’t think saying any of this is particularly earth-shattering. I am very sure that the vast majority of my Facebook connections are not pining away over the fact that I am not working hard at maintaining strong connections with every single one of them. Indeed, I doubt the vast majority of them will even know I wrote this blog post, since they will miss it in their news feed. And a good many of them are probably secretly annoyed that I write a daily blog that appears in their news feed, but for the sake of our connection they graciously overlook the annoyance.

On the other hand, I do think there is a broad principle about what it means to be human at stake here. Connection isn’t the only value. Losing connection, separation, dying to some things and people and selves so that new selves can live: these are values our age doesn’t talk much about, caught up as we are in our dreams of a heaven of infinite connection. They are, however, facts, and even values, that make any kind of living at all possible.

Grading the Crowd

Can the wisdom of crowds apply to grading student papers, or to the evaluation of culture more generally? What about the quality of a theological argument, or a decision about foreign policy? We’re quite taken with the idea of crowds and collaboration lately, and not without good reason. I think there’s a great deal to be said for getting beyond the notion of the isolated individual at work in his study; especially in the humanities, I think we need to learn something from our colleagues in the sciences and think through what collaborative engagement as a team of scholars might look like as a norm rather than an exception. At the same time, is there a limit to collective learning and understanding? Can we detect the difference between the wisdom of the crowd and the rather mindless preferences of a clique, or a mob? I found myself thinking about these things again this evening as I read Cathy Davidson’s latest piece in The Chronicle Review, “Please Give Me Your Divided Attention: Transforming Learning for the Digital Age.”

Cathy Davidson, "Now You See It"

I wrote about Davidson a couple of days ago–she’s around a lot lately, as authors tend to be when a new book comes out that a publisher has decided to push–and I feel almost bad taking up only my crabbiest reactions to her recent work. First, let me say that I briefly crossed paths with Davidson at Duke, where she was hired the year I finished my doctorate in English. She seemed like a breath of fresh and genuine air in a department that could sometimes choke on its collective self-importance, and the enthusiasm and generosity and love of teaching that Davidson evinces in this essay were evident then as well, though I never had her for class. And, as this comment suggests, I think there’s a lot in this essay that’s really important to grapple with. For one thing, she shows how the experiment in iPod pedagogy with which she and some of her colleagues at Duke trusted their students paid off in many unexpected ways, and we now know a good bit of it was far ahead of its time. Moreover, she paints a wonderful picture of students as collaborative teachers in the learning process in her course on the way neuroscience is changing everything. Still, as with a lot of these things that focus on student-centeredness, I find that promising insights are blinded by what amounts to a kind of ideology, one that may not be as deeply informed about human action as it really ought to be. I felt this way in Davidson’s discussion of grading.

 There are many ways of crowdsourcing, and mine was simply to extend the concept of peer leadership to grading. The blogosphere was convinced that either I or my students would be pulling a fast one if the grading were crowdsourced and students had a role in it. That says to me that we don’t believe people can learn unless they are forced to, unless they know it will “count on the test.” As an educator, I find that very depressing. As a student of the Internet, I also find it implausible. If you give people the means to self-publish—whether it’s a photo from their iPhone or a blog—they do so. They seem to love learning and sharing what they know with others. But much of our emphasis on grading is based on the assumption that learning is like cod-liver oil: It is good for you, even though it tastes horrible going down. And much of our educational emphasis is on getting one answer right on one test—as if that says something about the quality of what you have learned or the likelihood that you will remember it after the test is over.

Grading, in a curious way, exemplifies our deepest convictions about excellence and authority, and specifically about the right of those with authority to define what constitutes excellence. If we crowdsource grading, we are suggesting that young people without credentials are fit to judge quality and value. Welcome to the Internet, where everyone’s a critic and anyone can express a view about the new iPhone, restaurant, or quarterback. That democratizing of who can pass judgment is digital thinking. As I found out, it is quite unsettling to people stuck in top-down models of formal education and authority.

Davidson’s last-minute veering into ad hominem covers over the fact that she doesn’t provide any actual evidence for the superiority of her method, offers a cultural fact as a substantive good–if this is how things are done in the age of digital thinking, it must be good, you old fogies–seems to crassly assume that any theory of judgment that does not rely on the intuitions of 20-year-olds is necessarily anti-democratic and authoritarian, and glibly overlooks the social grounding within which her own experiment was even possible. All of this does sound like a lot of stuff that comes out of graduate departments in English, Duke not least of all, but I wonder if the judgment is really warranted.

An alternate example would be a class I occasionally teach, when I have any time to teach at all anymore, on book reviewing. In the spirit of democratizing the classroom, I usually set this course up as a kind of book contest in which students choose books to review and, on the basis of those reviews, books proceed through a process of winnowing until at last, with two books left, we write reviews of the finalists and then vote for our book of the year. The wisdom of the crowd does triumph in some sense, because through a process of persuasion students have to convince their classmates which books are worth reading next. The class is partly about the craft of book reviewing, partly about the business of book publishing, and partly about theories of value and evaluation. We spend time not only thinking about how to write effective book reviews for different markets; we also discuss how theorists from Kant to Pierre Bourdieu to Barbara Herrnstein Smith treat the nature of value, all in an effort to think through what we are saying when we finally sit down and say one thing is better than another thing.

The first two times I taught this class, I gave the students different lists of books. One list included books that were shortlisted for book awards, one list included first-time authors, and one list included other books from notable publishers that I had collected during the previous year. I told them that to begin the class they had to choose three books to read and review from the lists I had provided, and that at least one book had to be by a writer of color (my field of expertise being ethnic literature of the U.S., I reserved the right). They could also choose one book through their own research to substitute for a book on one of my lists. Debates are always spirited, and the reading is always interesting. Students sometimes tell me that this was one of their favorite classes.

The most recent time I taught the class, I decided to take the democratizing one step further by allowing the students to choose all three books entirely on their own. Before class began I told them how to go about finding books through major industry organs like Publishers Weekly, as well as how to use the search engines on Amazon and elsewhere–students, their digital savvy notwithstanding, are often surprised at what you can do with a search engine. The only other guidance was that students would ultimately have to justify their choices by defending in their reviews why they liked the books and thought they could be described as good works of literature, leaving open what we meant by terms like “good” and “literature,” since that was part of the purpose of the course.

The results were probably predictable but left me disheartened nonetheless. Only one book out of the fifty-some books in the first round was by a writer of color. A predictable problem, but one I had kept my fingers crossed would not occur. More than half the books my students chose were from the romance, mystery, fantasy, and science fiction genres. Strictly speaking I didn’t have a problem with that, since I think great works of fiction can be written in all kinds of genres, and most works of what we call literary fiction bear the fingerprints of their less reputable cousins (especially mystery writing, in my view, but that’s another post). I thought there might be a chance that there would be some undiscovered gem in the mix. I do not have time to read all fifty books, of course, but rely on students to winnow for me and then try to read every book that makes the round of eight. It’s fair to say that in my personal authoritarian aesthetic, none of the books that fell into those genres could have been called a great work of fiction, though several of them were decent enough reads. Still, I was happy to go along with this and see where things would take us, relatively sure that as things went on and we had to grapple with what it meant to evaluate prose, we would probably still come out with some pretty good choices.

Most of the works that I would have considered literary were knocked out by the second round, though Jonathan Franzen’s Freedom did make it all the way to the finals, paired against Lisa Unger’s entry from 2010, whose title I can’t even remember now. In the end the class split almost down the middle, but chose Unger’s book as the best book they had read during the course of the semester. Not that I thought it was a terrible book. It was a nice enough read, and Unger is a decent enough mystery writer. But being asked to remember the book is a little like being asked to remember my last trip to McDonald’s. One doesn’t go there for a memorable dining experience, and one doesn’t read Lisa Unger in order to come up with books that we will care to remember several weeks after having set them down. But what was perhaps most intriguing to me was that after an hour-long discussion of the two books, in which students offered spirited defenses of each writer, I asked them: if they could project themselves into the year 2020 and had to choose only one book to include on a syllabus for a course on the best books of 2010, which book would it be? Without exception the students voted for Franzen’s book. When I asked the students who changed their votes why this would be, they said, “We think that Franzen is more important; we just liked reading Unger more.”

This is the nub. Can the wisdom of crowds decide what is most important? To that, the answer can only be “sometimes.” Just as often, crowds choose what is conveniently at hand, what satisfies a sweet tooth, or even what satisfies the desire for revenge. Is there a distinction between what is important or what is true and what is merely popular? Collaboration can lead us past blindnesses, but it is not clear that the subjectivity of a crowd is anything but blind (in my original draft I typed “bling,” a telling typographical slip and one that may be truer and more interesting than “blind”). It is not clear that crowds can consistently be relied upon to decide by intuition what ought to last. This may not be digital thinking, but at least it is thinking, something crowds cannot always be relied upon to do.

If we could really rely on crowds to make our choices, we would discover that there is really very little to choose between almost anything. Going on Amazon, what is amazing is that four stars is the average score for the hundreds of thousands of books catalogued there. And popularity trumps everything: Lisa Scottoline scores higher in a lot of cases than Jane Austen. Literally everything is above average and worth my time. This is because in the world of the crowd, people mostly choose to be with those crowds that are most like themselves and read those things that are most likely to reinforce the sense they have that they are in the right crowd to begin with. Even elementary studies of internet usage have pointed this out. Liberals read other liberals, and delight in their wisdom and the folly of conservatives. Conservatives read other conservatives and do likewise. This too is digital thinking, and in this case it is quite easy to see that crowds can become authoritarian over and against the voice of the marginalized. My students’ choice not to read writers of color unless I tell them to is only one small reminder of that.

Which leads to one last observation. I wonder, indeed, whether this experiment worked so well at Duke because students at Duke already know what it takes to get an A. That is, in some sense Davidson is not really crowdsourcing at all but is relying on certain educational processes that deliver students well attuned to certain forms of cultural excellence, able to create effectively and “challenge” the status quo because they are already deeply embedded within those forms of cultural excellence and all the assumptions they entail. That is, as with many pedagogical theories offered by folks at research institutions, Davidson isn’t theorizing from the crowd but from a tiny, elite, and extremely accomplished sample. As Gerald Graff points out, most of us most want to teach the students who don’t need us, which means most of us want to teach at places where none of the students actually need us. Davidson’s lauding of her students for the fact that they don’t really need her to learn may merely be an index of their privilege, not of the inherent wisdom of crowds or the superiority of her pedagogical method. Her students have already demonstrated that they know what it takes to get A’s in almost everything, because they have them–and massively high test scores besides–or they are unusually gifted in other ways, or they wouldn’t be at Duke. These are the students who not only read their whole AP reading list the summer before class started; they also read all the books related to the AP reading list and attended tutoring sessions to learn to write about them besides.

Let me hasten to say that there is absolutely nothing wrong with their being accomplished. On some level, I was one of them, having gone to good private colleges and elite graduate programs. But it is a mistake to assume that the well-learned practices of the elite, the cultural context that reinforces those practices, and the habits of mind that enable the kinds of things Davidson accomplished actually form the basis for a democratizing pedagogy for everyone. Pierre Bourdieu 101.

We’re all pre-professional now

I’ve been catching up this evening on a backlog of reading I’ve stored on Instapaper. (I’m thinking “backlog” might be the right word: I don’t think you can have a “stack” on an iPad.) A few weeks back Cathy Davidson down at Duke University had an interesting piece on whether college is for everyone. Davidson’s basic thesis, as the title suggests, is no. Despite the nationalist rhetoric that attends our discussions of higher education–we will be a stronger America if every Tom, Dick, and Henrietta has a four-year degree–maybe, Davidson suggests, we’d have a better society if we attended to and nurtured the multiple intelligences and creativities that abound in our society, and recognized that many of them are best nurtured somewhere other than a college or university:

The world of work — the world we live in — is so much more complex than the quite narrow scope of learning measured and tested by college entrance exams and in college courses. There are so many viable and important and skilled professions that cannot be outsourced to either an exploitative Third World sweatshop or a computer, that require face-to-face presence, and a bucketload of skills – but that do not require a college education: the full range of IT workers, web designers, body workers (such as deep tissue massage), yoga and Pilates instructors, fitness educators, hairdressers, retail workers, food industry professionals, entertainers and entertainment industry professionals, construction workers, dancers, artists, musicians, entrepreneurs, landscapers, nannies, elder-care professionals, nurse’s aides, dog trainers, cosmetologists, athletes, sales people, fashion designers, novelists, poets, furniture makers, auto mechanics, and on and on.

All those jobs require specialized knowledge and intelligence, but most people who end up in those jobs have had to fight for the special form their intelligence takes because, throughout their lives, they have never seen their particular ability and skill set represented as a discipline, rewarded with grades, put into a textbook, or tested on an end-of-grade exam. They have had to fight for their identity and dignity, their self-worth and the importance of their particular genius in the world, against a highly structured system that makes knowledge into a hierarchy with creativity, imagination, and the array of so-called “manual skills” not just at the bottom but absent.

Moreover, Davidson argues that not only does our current educational system fail to recognize and value these kinds of skills on the front end, but when we actually get students into college we narrow their interests yet further:

All of the multiple ways that we learn in the world, all the multiple forms of knowing we require in order to succeed in a life of work, is boiled down to an essential hierarchical subject matter tested in a way to get one past the entrance requirements and into a college. Actually, I agree with Ken Robinson that, if we are going to be really candid, we have to admit that it’s actually more narrow even than that: we’re really, implicitly training students to be college professors. That is our tacit criterion for “brilliance.” For, once you obtain the grail of admission to higher ed, you are then disciplined (put into majors and minors) and graded as if the only end of your college work were to go on to graduate school where the end is to prepare you for a profession, with university teaching of the field at the pinnacle of that profession.

Which brings me to my title. We’re all pre-professional now. Since the advent of the university, if not before, there has been a partisan debate between growing pre-professional programs and what are defined as the “traditional liberal arts,” though in current practice, given the cachet of science programs in the world of work, this argument is sometimes really between the humanities and the rest of the world.

Nevertheless, I think Davidson points out that in the actual practice of the humanities in many departments around the country, this distinction is specious. Many humanities programs conceive of themselves as preparing students for grad school. In the humanities. In other words, we imagine ourselves as ideally preparing students who are future professionals in our profession. These are the students who receive our attention, the students we hold up as models, the students we teach to, and the students for whom we construct our curricula, offer our honors, and save our best imaginations. What is this, if not a description of a pre-professional program? So captive are we to this conceptual structure that it becomes hard to imagine what it would mean to form an English major, History major, or Philosophy major whose primary implicit or explicit goal was not to reproduce itself, but to produce individuals who will work in the world of business–which most of them will do–or in non-profit organizations, or in churches and synagogues, or somewhere else that we cannot even begin to imagine. We get around this with a lot of talk about transferable skills, but we actually don’t do a great deal to help our students understand what those skills are or what they might transfer to. So I think Davidson is right to point this out and to suggest that there’s something wrongheaded going on.

That having been said, a couple of points of critique:

Davidson rightly notes these multiple intelligences and creativities, and she rightly notes that we have a drastically limited conception of society if we imagine a four-year degree is the only way to develop these intelligences and creativities effectively. But Davidson remains silent on the other roles of higher education, the forming of an informed citizenry being only one. Some other things I’ve seen from Davidson, including her new book Now You See It, suggest she’s extremely excited about all the informal ways that students are educating themselves, and that she doubts higher education’s traditional role as a producer and disseminator of knowledge, which she believes has been drastically undermined. I have my doubts. It is unclear that a couple of decades of the internet have actually produced a more informed citizenry. Oh, yes, informed in all kinds of ways about all kinds of stuff, like the four thousand sexual positions in the Kama Sutra, but informed in a way that allows for effective participation in the body politic? I’m not so sure.

I think this is so because to be informed is not simply to possess information, but to be shaped, to be in-formed. In higher education this means receiving a context for how to receive and understand information, tools for analyzing, evaluating, and using information, and the means for creating new knowledge for oneself. To be sure, institutions of higher education are not the only places where this happens, but it is clear that it doesn’t happen willy-nilly just because people have a Fios connection.

What higher education can and should give, then, is a lot of the values and abilities associated with a liberal arts education traditionally conceived–as opposed to conceived as a route to a professorship–and these are values, indeed, that everyone should possess. Whether that requires everyone to have a four-year degree is an open question. It may be that we need to rethink our secondary educational programs so that they inculcate liberal arts learning in a much more rigorous and effective way than they do now. But I still doubt that the kind of learning I’m talking about can be achieved simply by 17-year-olds in transformed high schools. Higher education should be a place for the maturing and transformation of young minds toward a larger understanding of the world and their responsibilities to it, which it sometimes is today, but should be more often.