The Truth of Seeking Truth

One of the most damaging contemporary dicta is that truth does not exist: that all we know is our own perspective, if even that. According to some, if you so much as mention truth, you have revealed your own outdatedness. The pursuit of truth can only lead farther into illusion, some say; to be with the times, one must admit that there’s no ultimate truth at all.

Were it not for its emphasis on being with the times, the above could seem plausible. Again and again, we think we know what happened in a given episode in our lives, only to find out later that our understanding was just a fragment and that the various known fragments do not complete a whole. Not only that, but even if all of the information were available, we could only make sense of it through stories–and stories require selection, emphasis, and sequence. There is no way to convey a full picture, even if it exists; our language, existing in time, does not allow for such complexity and completeness.

Yet much of our experience is sturdy. The bicycle does not turn into a tractor from one day to the next. The slice of pizza does not become a cherry pie in the middle of a bite. If you go to a concert, and you remember it the next day, so do others; if you teach a class, there’s general agreement, the next time, about what the lesson contained, even if not everyone remembers everything. So consistency of experience and commonality of memory point to some reality outside of us, a reality that can be called true.

Moreover, we are disposed to seeking out truth; day after day, we try to find out what really happened, what was really said, what a word means, where a particular thing is located, what causes a particular phenomenon, and what we think; this pursuit is not all in vain, nor does it follow a set schedule. When you find the solution to a math problem, it stays; when you understand a word, the understanding abides, even if it changes over time. Knowing your own thoughts may be the most difficult challenge of all, since you are thinking them even as you examine them. Even so, we probably all have had moments of clarity, of knowing, at least for an instant, who we are.

That we build justice systems, schools, governments, news publications on the pursuit of truth does not, in itself, prove truth’s existence; looking at the history of such institutions, we can find many deceptions and follies. Still, people coming together in a courtroom affirm that through assembling the evidence, hearing the witnesses, and deliberating, a jury can reach a fairer and more accurate verdict than it would without these actions. In the classroom, anyone can make mistakes, but the very existence of mistakes suggests the possibility of accuracy. In newspapers and on news programs, a story can get distorted, but then, over time, others correct the record. At their best, all of these institutions pursue truth instead of claiming to have it–and demonstrate, through their daily work, that such pursuit is possible.

Literature can hold truth, but it does this through seeking. Blake’s “Auguries of Innocence” sounds on the surface like a simple telling of truth, but the truth moves before our eyes, changing color and tone, ambling through grief and delight. Pushkin’s Evgeny Onegin plays with deception and dissimulation but reaches a kind of clarity. Eliot’s “Prufrock” seeks something too, in a muted and doubting way. I cannot think of a work of literature (that is, a work that I would want to reread) that does not in some way seek truth, integrity, precision, form, completion, or clarity (and their necessary companions). It may or may not reach an answer, but it takes the reader from one place of understanding to another.

The search for truth does not move with the times; it may go against the passions and predilections of a given culture or group. It follows its own timing; discoveries and insights do not always arrive on schedule, but may come when unexpected and fail to arrive when expected. How many of us have recognized one of our mistakes long after making it; how many dramatic works rely on such mistiming? It would be better to catch a mistake in advance, but short of that, we take insight as it comes.

Each of us seeks some kind of truth: some with enthusiasm, some with weariness, some with direction and purpose, some with open curiosity. To respect others is to recognize that they seek just as I do–not in the same way or with the same timing, but for similar reasons: they want to understand what they do not now understand; they believe, as I do, that there is something to learn.

I took the photo when crossing the Zagyva last week. Also, I made a few additions to this piece after posting it.

Is There a Human Project?


It is the season of cherries and ice cream, of ducklings and Scarce Copper butterflies (I think that’s the type in the picture above), of wrapping up the school year and saying goodbye for the summer. Also, the book is almost in press; the last corrections have been made, and I must now think about the release in the fall. While doing this, I find myself questioning certain phrases in the book. At one point I mention the human project. Is there a human project? Or is this yet another phrase that has lost meaning over time?

It exists but abounds with contradictions, oppositions, anomalies, impossibilities. Drawing partially on George Kateb’s Human Dignity, I would define the human project, in part, as our ongoing assumption (and abdication) of responsibility as stewards of nature, including our own. Humans alone have the capacity to act as stewards–or not. Acting as steward involves recognizing what one has done, or can do, to help or harm oneself and others–and who these others are, and why it matters. In this recognition, humans have advanced somewhat, in some ways, over time. Certain things that we recognize as wrong, such as slavery, were accepted not long ago.

Last week I introduced my eleventh-grade students to the song “Amazing Grace,” which a few already knew. I thought it was important for American civilization, especially since we were now touching on religion. I did not know the origins of the song (having missed the Broadway musical and the movie and forgotten a good bit of history); when I read about it, I heard it in a new way.

It was composed by the English Anglican minister John Newton (1725-1807), who, prior to his Christian conversion, had been forced into the slave trade. He had rebelled so often aboard the ships–not on behalf of the slaves, but on his own behalf–that he had undergone lashings, demotions, and finally slavery, when the crew left him in West Africa with a slave dealer. He was finally rescued and brought back to England; during the voyage, he had a spiritual conversion. Slowly, over time, this conversion brought him to abhor the slave trade. This did not happen linearly; he returned to the slave trade, fell ill, and underwent a new conversion. He continued in the trade a few more years, and then in 1754 renounced it completely.

His tract Thoughts Upon the African Slave Trade, written in 1788, thirty-four years after he abandoned the business, repudiates the enslavement and trafficking of humans. It begins:

The nature and effects of that unhappy and disgraceful branch of commerce, which has long been maintained on the Coast of Africa, with the sole, and professed design of purchasing our fellow-creatures, in order to supply our West-India islands and the American colonies, when they were ours, with Slaves; is now generally understood. So much light has been thrown upon the subject, by many able pens; and so many respectable persons have already engaged to use their utmost influence, for the suppression of a traffic, which contradicts the feelings of humanity; that it is hoped, this stain of our National character will soon be wiped out.

If I attempt, after what has been done, to throw my mite into the public stock of information, it is less from an apprehension that my interference is necessary, than from a conviction that silence, at such a time, and on such an occasion, would, in me, be criminal. If my testimony should not be necessary, or serviceable, yet, perhaps, I am bound, in conscience, to take shame to myself by a public confession, which, however sincere, comes too late to prevent, or repair, the misery and mischief to which I have, formerly, been accessary.

I hope it will always be a subject of humiliating reflection to me, that I was, once, an active instrument, in a business at which my heart now shudders.

Hearing those undertones in “Amazing Grace” (although the hymn preceded the tract by two decades or so), I understand the song not as a paean to the born-again experience but as the author’s recognition of profound error. To see that one has been terribly wrong and to change one’s life accordingly: this allows for something of a human project. For by writing what he saw and learned, Newton allowed others to see it too.

I don’t want to be glib about this. Looking at the picture below, I would say that ducks do a bit better with their projects than humans; they lead their little ones, which grow up to have little ones of their own. But ducks also kill ducklings that they do not recognize–and suffer no qualms of conscience, as far as I know. It is not that we humans do so well with our conscience–we continue to do things that we repudiate or simply fail to question–but our conscience also matures, not only through experience in the world, but through encounters with books, speeches, music, plays. In listening to something, we come to take our own measure. Or at least we may. To the extent that we do, we participate in a human project.

I ask myself why I didn’t notice the Broadway musical Amazing Grace, which would have taught me something, even fleetingly, about John Newton. I think I unthinkingly ignored it because of the title. I had heard the song sung mockingly so many times that I had absorbed the mockery. That reminds me to be less sure of my mockeries, especially borrowed ones. Mockery has a place in writing–there would be little satire without it–but it must be informed. In this case mine, though never overt, was also ignorant until now.


I made a few revisions to this piece after posting it.

The Gift of Criticism

A few years ago I edited a student’s piece on Machiavelli; I had recruited it at the last minute for my students’ philosophy journal CONTRARIWISE and found that it needed clearer wording in places. When I presented him with the edits, he said that he accepted some but not all of them: that in a few places he was trying to say something else. We sat down to discuss this. In telling me what he meant, he found the right wording; by the end of our meeting, he had revised the piece to his satisfaction. This happened because he was open to the suggestions but strong enough to make his own decisions. Also, I saw past the particulars of my edits; I wanted to help him find his words, not replace them with mine.

This memory returns as I ponder two recent articles about Amy Cuddy and the power pose: Susan Dominus’s New York Times piece and Daniel Engber’s response in Slate. I find Engber’s article much clearer and more to the point–but he also has the benefit of hindsight, critique, and revision. Dominus may well follow up with some afterthoughts. She tackled a complex and heated topic and (from what I can see) did her best to present it fairly. Yet the article fails to distinguish adequately between personal attack and criticism. I posted a comment, which I am developing a little further here. This piece is not about Cuddy; it’s about criticism itself. (Regarding the power pose study, there are numerous recent comments–from many perspectives–in the article’s comment section and on Andrew Gelman’s blog.)

Here’s the key difference, as I see it, between criticism and personal attack: criticism gives you something concrete to consider, something about the issue at hand, be it your work, your actions, or even your personality. Its aim is to point out areas for improvement. It is not always correct or kind; sometimes critics can be vehement and unsympathetic, and sometimes they make mistakes or show biases. But if it is about the thing itself, if it analyzes strengths and weaknesses in a coherent way, it counts as criticism. By its nature it points toward improvement. It is not necessarily negative; it can recognize strengths and excellence.

Personal attack does not give you a chance to improve. Maybe it comes in the form of vague and veiled hints. Maybe it’s incoherent. Maybe it focuses on your personal life instead of the issue at hand. Maybe it gets said behind your back, without your knowledge. Or maybe it’s about something so fundamental to you that it’s unfair to expect you to change. In any case, when it comes to helpful content, there is no “there” there, at least no “there” that invites you in.

In that light, criticism is a gift, even when the delivery is not ideal. It offers working material. But our culture is not well attuned to criticism; we’re taught to hear the “yay” or “nay,” the “up” or “down,” not the subtler responses. For criticism to achieve its purpose, several conditions must exist.

First, institutions would have to make generous room for error, reexamination, and correction. Universities, schools, scientific organizations, and publications should not only acknowledge error openly but treat it as part of intellectual life, not as cause for shame or demotion.

Second, the person giving the criticism should do so as frankly and humbly as possible: laying the critique on the table without claiming superiority. There’s some disagreement over whether this should happen in private or public, in person or online. As I see it, a published work can be criticized anywhere–online or offline, in public or private–but an unpublished work or private act should receive more discreet treatment. Published books get reviewed publicly, after all; there’s no suggestion that a reviewer should contact the author privately before saying something in the New York Review of Books. But if I send someone an unpublished manuscript for comment, I expect this person to reply to me alone (or me and my editor) and not to the world.

Third, the person receiving the criticism should learn to hear it and separate it from the emotion it may stir up. Even thoughtful, carefully worded criticism can be hard to hear. It takes some strength to sort out the upset feelings from the actual content of the words. It takes even more to decide which parts of the criticism to take, which to reject, and which to continue considering. Some criticism incites us to reconsider everything we have done; some draws attention to small (but important) details. To hear and use criticism well is to open oneself to profound improvement.

Just before the final manuscript of Republic of Noise was due, someone who read the manuscript offered me some far-ranging suggestions. I saw her points but didn’t want to apply them rashly, in a rush. To decide whether, how, and where to apply them, I would need much more time than I had. I decided to keep them in mind for the future. I am glad of this decision; the book was the way I wanted it, but her suggestions helped me with subsequent writing.

Why do I say that our culture isn’t set up well for criticism? We aren’t taught how to handle it. As a beginning teacher, I remember being told (at numerous “professional development” sessions) not to use red pen, since it could make a student feel bad; not to write on students’ work, but to use Post-its instead; and to limit the comments to two commendations and two general suggestions for improvement. While some of the gist is good (one should avoid overwhelming students or treating one’s own appraisal as the last word), it assumes students’ extreme fragility in the face of concrete, detailed suggestions. The more we treat criticism as devastating, the more fragile we make ourselves (both the critics and the recipients).

Hearing criticism–actually perceiving and considering its meaning–deserves continual practice. It requires immersion in the subject itself; you can’t practice criticism without practicing the thing criticized. It isn’t always fun, but it can lead to exhilaration: you see, on your own terms, a way of doing things better.


Image: Norman Rockwell, Jo and Her Publishor (this title may or may not be correct; I have also seen it as Jo and Her Publisher and Jo and Her Editor). This is one of his several illustrations of Louisa May Alcott’s Little Women. In Chapter 14, Jo publishes two of her stories in a newspaper.

I made a few revisions and additions to this piece after posting it.

Teaching in Vastness

I am ambivalent about Parker J. Palmer’s 1998 book The Courage to Teach, but I return to it as I assemble thoughts on teaching. I treasure passages in this book and admire its durability overall. Palmer makes a vitally important argument: that good teaching comes from the teacher’s identity and integrity. There is no single “successful” pedagogical style; one teacher may teach through lecture and another through dialogue, but if both are deeply connected to the subject and aware of themselves and their students, they can both do powerful work.

A teacher, says Palmer, works on the border between the public and the private—“dealing with the thundering flow of traffic at an intersection where ‘weaving a web of connectedness’ feels more like crossing a freeway on foot. As we try to connect ourselves and our subjects with our students, we make ourselves, as well as our subjects, vulnerable to indifference, judgment, and ridicule” (18). To ward off this danger, according to Palmer, we tend to disconnect—and this disconnectedness hurts education and those involved in it.

All true—but when I read Palmer’s words, and continue to read, I get restless for something more. (He recognizes the danger of sounding pat–but falls into that trap repeatedly.) Yes, identity and integrity are essential to teaching, but there’s something beyond both of them. To have identity and integrity, you must go into something larger than yourself. To hold up at the intersection between public and private, you must be aware of something beyond public and private, something that transcends the two.

Or maybe this is not necessary for all; I have no way of knowing. What is it, though? What is this space or sound or presence that can shape a teacher’s work?

Every day in the classroom, I run up against my own imperfections: I make a mistake, misunderstand something that a student said, get slightly irritated, answer a question too quickly, or find myself combating something internal—an area of ignorance, an excess, a sadness, even a rampant joy. In the moment, there’s nothing much that I can do beyond using my best judgment, which is far from perfect. Then, later, when I sort through the events of the day, something else happens.

I don’t just “reflect” on what went right or wrong. That’s an important (and much touted) part of teaching, but only a part. Reflections, after all, must be informed—and where does that form come from? First, it comes from immersion in subjects—any subjects. I learn as much about teaching philosophy when immersed in Russian or Hebrew as I do when reading Machiavelli. Learning to consider the sounds, shapes, roots, and different meanings of words—learning their tones, weights, and connections—all of this helps the teaching. Also, when I study anything beautiful or important, I find out, all over again, what education means and how it happens. That said, there are special reasons to immerse myself in the specific subject I teach—to read and reread Machiavelli, Locke, etc. I find out, over and over again, that there’s far more than I presented or even suggested in the lesson. New lesson plans light up in my mind.

There’s still another kind of immersion. When I go through the events of the day, I find myself in a silent, private dialogue—not with myself, really, or with God (I don’t claim such direct access), but with something a little beyond myself. I am able to sort out not only the practical aspects of what I did that day, not only the ethical aspects, but something else, something that puts the events in their proper place, a place I wouldn’t have seen on my own. Without this, I would lose perspective and become overwhelmed.

For example, last week, in one of my classes, I found myself telling my students about a dream in which one of the assistant principals appeared. (The subject came up because he had just popped into the classroom a moment earlier, and a student had mentioned having a dream about him.) My dream was strange and brief, with no embarrassing events. It wasn’t too far off topic, since we were discussing Saul Bellow’s Seize the Day, which is filled with dreams of a kind. Still, I felt a bit off kilter after telling it. I didn’t know whether I had done the right or wrong thing.

From a practical standpoint, it was a bit of a digression, but it didn’t do any harm. From an ethical standpoint, it was mostly harmless, though it feels “gossipy” to tell about a dream that involves a colleague, even though the person isn’t really involved at all. That said, there was nothing gossipy about the dream, in which I was the conductor of a mostly empty train, and he was giving me driving advice (I think).

But there’s something else to reckon with, beyond practical and ethical matters. I recognized, as I went into rumbling thought, that I was feeling unwell on that day and that my gauges were a little off. I also saw that I was starting, in general, to relax around my students and tell them stories now and then—and figuring out when and when not to do so. There would never be a final, fixed answer, but I was finding my way. This meant that there would be errors, or semi-errors, or things that seemed like errors. It is an important question, when and when not to tell a story, since we are made of stories. I loved the stories that my teachers and professors told me over the years. They didn’t distract from the subject; rather, they made things more vivid overall.

How is this different from “identity and integrity”? It differs from them only insofar as it is their source. I find, again and again, that I am up against immensity, or maybe not up against it at all, but walking and thinking in it—and that this is the honor of teaching. Those running the system ask us to show results, to show that the students have moved from point A to point D. That is a reasonable request, if put in its proper place. Palmer would add that a teacher should teach from the self–a self that inhabits the subject. Yes, I grant that as well. But there is something beyond the self, an invisible teacher without lessons, maybe, who shakes me out of my limited senses and points out signs of life.

Never Forget How to Let Go of a Bad Hypothesis

In blog-land, I know I am an insignificant creature among insignificant creatures. Up goes another blog. Three people read it. There I go posting a comment on someone else’s blog. I put an hour into it, and then look aghast at my day. If blogs get forgotten, blog comments get doubly and triply forgotten.

Not always, though. In late January 2009, when Eduwonkette “hung up her cape,” a comment appeared on her blog. Though anonymous, it clearly came from a wise and knowledgeable person. It is about the importance of admitting that you’re wrong, when you are wrong. (“Eduwonkette” was the pseudonym or “mask” of the magnificent education blogger Jennifer Jennings, now assistant professor in the Sociology Department at NYU.)

I think I was moved to something like tears at the time. Maybe not tears, maybe just a gulp and a lot of thinking. I have thought back on that comment many, many times. I have no idea who wrote it. The person used the pseudonym “Right2BWrong” (just for the occasion, I presume).

I am reprinting it here, with full attribution: it first appeared as a comment on Eduwonkette’s Education Week blog on January 27, 2009, the day after the last day of the blog. I will comment on one aspect of it in a separate post.

Here it goes:

Like everyone here, I am sorry you will not be blogging, but agree that you are making a wise choice. Finishing your dissertation is the key to your future and NYU is not a bad place to make money while you do it.

Since no one else has dared to offer any advice, I will. As you know, anonymity gives people a chance to say what they really mean without the fear of reprisal. So, let me offer this anonymous advice. Whatever else you do with the rest of your life, do not become any of the people your critics once imagined you to be.

As you recall, before your unmasking, many of the people behind the studies and press releases and policy “think” tanks you reviewed tried to guess who you were. What did they guess? Some thought you were a policy wonk whose only interest in data was to score political points. They speculated as to who might be funding you; some wondered about EdWeek’s motivation. Others thought you were a disgruntled DOE employee out to settle a personal vendetta against certain people. Some thought that, given your actual skills with data, you were a tenured academic, an ivory tower radical set to bring down the system without any concern for what might be built to replace it.

These are people who commonly battle it out in educational research “debates.” Is it any wonder your critics assumed you were one of them? But the critics were wrong.

Do you recall what bothered them most? They couldn’t figure out whose side you were on. After all, everyone on both sides of these issues has a vested interest in keeping this battle alive. If schools are not broken, who would be paid to fix them, who would be paid to report that the fix did or did not fix it, and who could build a coalition to fight the fixers or organize those who really believe in fixing? The game is called “cops and robbers.” There is no game called “robbers” because that is not much of a game. But you didn’t want to play the policy game. All you cared about was data.

And you had a secret weapon, the ultimate superhero advantage: Your future and your past were not dependent on the outcome. Consider the work of some people twice your age who have spent a professional lifetime dedicated to a hypothesis that does not seem to be supported by the data, most of which has been gathered too late in their careers for them to turn back. Consider the people whose reputations are built on their being the “data guru,” but who you have exposed as being perhaps one standard error below proficient in that role. Even some people your own age are already invested. Consider the work of some people your own age whose dissertations started with a policy conclusion and ended with a lot of data massaging, the numbers caressed until they could provide their funders with a happy ending.

You weren’t invested. You could follow the data. If your hypothesis was supported, you could report that. If your hypothesis was not supported, you could report that. In the blogosphere, you can even publish null results, something not as widely accepted in the academic world.

But soon you will become a bit more like your critics. As you grow in your academic career, you will find that certain results, certain publications, lead to opportunities. A sincere, scientific paper might result in a paid speaking engagement. A line of research on some policy might lead to an offer to head a new research department. In the academy, work that supports the current wisdom will help to secure your tenure. Success supporting a hypothesis may bring offers to edit a journal, write a book, or, who knows, become Dean. Success in the academic world may even lead to offers of much more money from a think tank or policy group, especially for someone who can communicate to a large audience. Oh, the places you could go — with all that money!

Soon, you will enter the world in which your critics live. You have visited many times, but soon you, too, will be a resident. No more green card. Full voting rights. Fully invested in the game.

So, how do you avoid becoming any of the people your critics thought you were? Here is the secret. Never forget how to let go of a bad hypothesis. The world of educational research is full of people who must, absolutely must, be right. Their reputations, their careers, their salaries, their retirement, and their personal relationships — their entire lives are dependent on being right about a hypothesis. Never allow yourself to fall into a position in which you become a slave to a hypothesis.

Years from now, remember that your critics tried to attack you here by proving, just once, that you were wrong about something. Any little analytical error would suffice, even if it was because they had provided you with the wrong data. They thought that by showing you were wrong, they could destroy you. In their world, being right is all that matters, regardless of the data.

The policy wonk, dependent on funders; the disgruntled employee, obsessed with petty squabbles; and the ivory tower radical fighting the system all have one thing in common. None of them can afford to admit when they are wrong. If you think about your heroes, even those who have been in this game for 20 or 30 years, you might realize that they all are people who are still willing to admit when they are wrong. Some of them are blogging, just around the corner…

Remember: Being right is a good defense, but being able to admit that you are wrong is the best defense. It is the secret superhuman strength that all real researchers possess. You have it now. It is yours to lose.

Like others here, I, too, look forward to hearing about your work and hope you will continue to contribute to educational research in the years to come. I hope that you are always right about everything. But the only sure proof that you have not become who your critics wanted you to be will be in the times when you report that you were wrong. I doubt you’ll need to say it often, but you will find a great strength in saying it when you do.

Good Luck,

Anonymous Still