Un-habeas corpus

This discussion of Sherlock 3.1 is replete with spoilers.

Other reviewers have said the obvious things and said them well.  Last night’s Sherlock was short on plot, but forgivably so, as it was so very long on delightful cleverness and wish-fulfilling moments.  I could go on, but this fellow has done it for me.  Instead I want to say something I haven’t read anywhere else.  I want to describe an irredeemable logical flaw.

Of all the many questions one might have about the way Sherlock fakes his death:  the relative heights of the buildings that stop John from seeing what is going on, the timing of Mycroft’s interception of the sniper and why, if he has the sniper under control, he can’t just halt the other snipers and call it a day, the speed with which it is possible to inflate and deflate an air-bag, the problematic lack of blood under the body — of all these questions and 50 more, one stands out, never raised and inexplicable:  why is there a corpse at all?  Think about it.  Sherlock hits the airbag, rolls off, stands aside while the bag is removed, runs back and lies down:  There is no need for a corpse.

I’ll be interested in explanations, but I don’t think I’ll get any.  It seems clear to me, after a lot of thought, that the whole drama is played out for John’s benefit.  Mycroft must already be with the sniper — the sniper who is to inform the other snipers — and that sniper, in any imaginable case, must have seen some of the elaborate set-up, since the only sight-line from which it cannot be seen is John’s.  But John never sees the corpse.

Of course before seeing the show we all thought there would be a corpse, because of the Molly involvement.  But that was only okay because we were all thinking, like Anderson, that it was a corpse on the ground, the corpse whose pulse John takes.  In that scenario you need a corpse.  In the one presented, you don’t.

Arendt and Milgram

In today’s Opinionator, Roger Berkowitz describes the most common misreading of Eichmann in Jerusalem, which has Arendt attributing Eichmann’s actions to following orders and, by extension, using the phrase “the banality of evil” to mean mindless, order-following bureaucracy.  Berkowitz tells us that this isn’t what Arendt says.  She did not portray Eichmann as a mere “clerk”;  this is not the quality that leads her to speak of his “inability to think.”  Eichmann’s thoughtlessness emerges, on the contrary, from what Berkowitz calls his being a “joiner,” that is, his enthusiastic embrace of an ideology.  And the medium by which his inability to think is sustained is his allegiance to cliches.

Arendt knew well that Eichmann was a fervent Nazi and a creative manager of death.  If Eichmann in Jerusalem makes us look more deeply at ourselves, it is not to ask whether we are all cogs in a bureaucratic machine.  It is rather to ask whether we are all ideologues — an ideologue being, in Berkowitz’s words, “someone who will sacrifice his own moral convictions when they come in conflict with the ‘idea’ of the movement that gives life meaning.”  And it is to ask whether we sustain this commitment through a set of commonly accepted and repeated cliches, cliches that ease the realization of the narrative we have bought into and at the same time hide our irresponsibility from ourselves.

I like Berkowitz’s argument, but I’d like to dispute one point.  It’s about Stanley Milgram.

“The widespread misperception,” Berkowitz writes, “that Arendt saw Eichmann as merely following orders emerged largely from a conflation of her conclusions with those of Stanley Milgram, the Yale psychologist who conducted a series of controversial experiments in the early 1960s. Milgram was inspired by the Eichmann trial to ask test subjects to assist researchers in training students by administering what they thought were potentially lethal shocks to students who answered incorrectly. The test subjects largely did as they were instructed. Milgram invoked Arendt when he concluded that his experiments showed most people would follow orders to do things they thought wrong.”

Do Milgram’s experiments really show only that people will follow orders against their moral sense?  There is more to it than that.  To erect a situation in which his subjects would follow orders, he had to invoke their status as joiners, and since he did not have time to train them in an ideology he had to use one that was already in place:  their utopian faith in the benevolence of science.  The subjects would in general not have recognized themselves as ideologues in the church of scientific progress, but they were believers enough:  enough that the laboratory they found themselves in, the lab coats worn by the experimenters, the clinical monotone the experimenters maintained, and the pseudoscientific sound of lines like “there will be no permanent damage to the tissue” — these things, these cliches, signaled to them their already accepted membership in something larger, something hopeful:  the promise of science.  Without the ideology and the cliches, there could have been no Milgram.  Milgram and Arendt are showing aspects of the same problem.

Berkowitz tells us that “Arendt rejected… Milgram’s claim that obedience carried with it no responsibility. Instead, Arendt insisted, ‘obedience and support are the same.’”  But Milgram is only claiming that being obedient makes us think we aren’t responsible, not that we should be held less responsible.  And isn’t this also the meaning of the line cited from Arendt?  Obedience and support are the same:  Arendt believes it, and Milgram believes it.  Obedience vs. support is, for both of them, a false opposition:  there is no obedience unless you’ve already invoked an ideology, unless the subject has, as Berkowitz puts it, joined.

So Arendt knows full well that there’s a sense in which Eichmann was indeed only following orders.  Which is not to say the scholars dismissed by Berkowitz aren’t wrong:  they are serious misreaders of Arendt if they think she doesn’t know that Eichmann was a fervent Nazi — really this makes it obvious that they haven’t read Arendt at all.  But the other mistake they make is one that Berkowitz makes too:  to think a clerk is ever merely a clerk.  For when Arendt rejects the obedience/support distinction, she is also rejecting the clerk/perp distinction.  The questions we must ask of our inner clerk are:  what makes you follow orders?  What makes you ally yourself with those who are giving you orders?  What makes you so involved that you will go beyond those orders into a creative application of the ordering ideology?  The “clerk” is not a cypher;  his autonomy was not taken from him.  The “clerk” has given up his autonomy, given it up to something he believes in.

Magnetic paint: go away

The craze these days for chalkboard walls is getting complicated. The latest thing is to use a metal-based paint under the chalkboard paint so that your kid has a wall that is both chalkable and magnetic.  This post is for parents considering this endeavour.  The gist, for all you busy people, is fairly simple:  NO.  For those of you with some leisure, I’ll lay it out now in point form.

Chalkboard paint is friendly and wants you to be happy.

-It smells good.
-It remains mixed in the tin.
-It applies to the wall like cream.
-It cleans up with soap and water.
-And it does what it promises.

Magnetic paint hates you and wishes you were dead.

-It smells like turpentine on steroids, and continues to smell for days.
-It settles at the bottom of the tin within minutes of an industrial shake-up, and can’t be remixed without 20 minutes of muscular stirring.
-It applies to the wall like a lump of metal, which is, in fact, what it is.
-It cleans up with NOTHING.
-And, the coup de grace: it does not do what it promises.

But, I hear you saying, I’ve come across people on the web singing the praises of magnetic paint!  Yes.  I read those things too;  that’s why I decided to use it.  But I’m here to tell you today those people are lying to you.  I know why they’re lying too.  They’re lying because this stuff is so malevolent they’re embarrassed to admit they bought it.  They struggled, they suffered, but now it’s over — and the last thing they want is their friends and neighbours to know how stupid they were to get duped by the other liars on the web and the smiley fellow at the hardware store.  I have no such shame.  I was stupid.  Two coats’ worth of stupid.  Learn from my example.

The one thing people on the web do admit is that the stuff doesn’t really work.  You have to get “rare earth” magnets, they say, by which they mean really strong magnets, and even those have problems sticking if you haven’t put on enough coats of the vile stuff.  They say “rare earth magnets” instead of strong magnets in a desperate attempt to make their bad decision look like a super bougie decision:  not just any magnets for us, no! only rare earth magnets!  Feh!  And the only reason they admit this one glaring, overwhelming problem with the product at all is that it’s the one they can’t hide.  Anyone who’s come into their house and tried to stick a magnet on the wall already knows.

Flipping the classroom

My college is talking about “flipping the classroom” and “blended learning,” so I went on wikipedia to find out what they were.  Basically the idea is for a professor to put his lectures on video so that the students can watch them before class, reserving classroom time for seminar discussions.  Or, as a second stage (except that for many of the people hyping the project this is the first stage, and the whole point), for a school to purchase a set of standardized video lectures which the students can watch before class, again reserving the classroom for seminar discussions.

Here are my thoughts:

1. Video lectures are the equivalent of a textbook.  In disciplines that use textbooks, I can’t see much objection to an on-line textbook as opposed to a hardback textbook.  My guess is that learning the material would take more time, since the video has to be played in real time while a textbook can be read quickly.  But I can see video demonstrations as useful.

2. Video lectures are not the equivalent of a classroom lecture.  The wikipedia article says that flipping the classroom will mean that “a teacher can spend more time interacting with students instead of lecturing.”  This betrays a misunderstanding.  Lecturing live, in the flesh, even to a large class, is already interacting with students.  Think of the difference between theatre and cinema.

3. Much of the hype about this silly project mentions the Khan Academy, which sells a video textbook on high school math.  When I took high school math, we used a hardback textbook.  Again, I have no problem with one substituting for the other, but neither is an adequate substitute for a good lecture.  My high school math teacher explained how to do a problem while doing it on the board, with supplementary reference to the history of math, to how it fit with other kinds of problems we had learned, to her husband, bridge, and golf.  We had a friendship with her, and that friendship helped us in our initial comprehension of the ideas she was trying to convey.

4. The project has no relevance to those disciplines where textbooks are not in common use, for instance my discipline.  I assign Mendelssohn’s Jerusalem.  I do not, on top of this, assign the relevant chapter in Norbert Samuelson’s History of Modern Jewish Philosophy, nor do I assign a tape from one of the several Open University courses on modern Jewish philosophy, nor would I ever assign a video of me or anyone else lecturing on the material.  I want the students to work through the primary source without help, to make what they can of it.  I want this both because I want them to learn to read philosophy and because I want each one to light on and ponder the parts of the text that are relevant to the synthetic, critical understanding he is building, not merely to buy into someone’s overarching narrative.  A textbook would flatten our subsequent discussion, and a video textbook all the more so, because of the persuasive authority inherent to that medium.

5. The buzzwords attached to this project, not just “flipping the classroom” and “blended learning” but also “backwards classroom,” “reverse instruction,” and “reverse teaching” are laughable.  Those sound like cool concepts.  What they sound like is my learning from my students at the same time as they learn from me, and us all having a philosophical epiphany as we thrash through some difficult material.  What they don’t sound like is what they actually mean:  my assigning a video textbook instead of lecturing.

6. Some things are meant to be watched.  I feel sure our classroom conversations could be deepened by all my students having watched The Wire, not to mention Sassy Gay Friend.  I also have no objections to their finding youtube footage of Derrida and the like.  Lectures, though, are not meant to be watched;  they are meant to be experienced.

*

These comments deal with the situation in my small liberal arts college.  The project would play out differently in the broader setting of the university.  It’s a push toward standardization of thought, and toward the elimination of the professoriate;  it also brings all the money there is to be made (which will be less and less as such initiatives succeed) to a few centres manned by experts in technology.

Grade inflation, and academic incivility (Gill #2)

Earlier this year, I attended a production of 42nd Street at Stratford. It was a satisfactory production, though hardly earth-shaking, but the audience gave it a standing ovation. And it was at this point that I realized I had to stop worrying about grade inflation.

What I realized, sitting there in the theatre, was that grade inflation isn’t just an academic problem. It’s a social problem. I don’t want to say that North American society as a whole has abdicated its power to judge anything as average or mediocre, but I do want to say that a whole lot of segments of society have: we (whoever “we” are, but bear with me) just don’t ever give anything a B anymore, whether it’s a theatre piece (yay! Bravo! Bravissimo! the best!) or a student essay (good work! A!). It isn’t as if we’re deceived either. I mean, the Stratford audience knew that that production was just pretty good; their ovation was half-hearted and it didn’t last long. And academics know that some of our A’s are, shall we say, A’s of lesser quality. But we can’t not stand up for the show, and we can’t say B.

There are good reasons for this, and they are well known. The push toward critical reflection has made us unsure of our standards. The drive to listen, to be changed by others, to consider different points of view — this makes it awfully hard to pinpoint some views as inferior. In short, it’s hard to be nonjudgmental and to judge at the same time. This is not the place to go on about these matters, though, because I want to say something else.

Taking up one of the themes from my last post, I’m thinking that this nonjudgmental quality, this restraint, might provide another reason academics are so uncivil about one another’s work when sheltered by anonymity. Maybe what’s coming out when we blind review each other with comments like “this is a piece of crap” is the suppressed desire to judge something, anything: we can’t give our students the B’s they deserve but we can damn well give our colleagues a D- or an F. “This is a piece of crap,” should therefore be read as saying: “it’s true I don’t apply any real standards in the classroom, but god dammit I still have them, so my field of study continues to have integrity!”

Of course academics have always exaggerated their petty disputes: the narcissism of small differences has characterized the academy for centuries. We’re all used to back-stabbing and we’ve all been back-stabbed. But still, the nastiness of the new style of peer-review might well be a backlash against our own uncertainty.

And so our internecine hostility grows — so much so that we will never come together against today’s real threat: anti-intellectualism. It is anti-intellectualism, rapidly spreading and intensifying in bitterness, that is behind the accountability culture that seeks to drown us in overwork. We all resent it. We all know that it is we who ought to be in a position to judge: we are the thinkers, we are the judges, we are the people who reflect and compare, we invented the goddam standards! — and it drives us crazy that we are being subjected to treatment we should be meting out. But we collaborate: because our ability to reflect has taken us to a point where we are no longer sure of our own standards and therefore in no position to judge others, and, even more, because we can use the accountability culture to fuel our petty grudges against one another and further our struggle for tiny gains in hallucinatory power.

Internalized oppression in the academy (Gill #1)

I’ve just read a piece by Rosalind Gill of King’s College, London, trying to describe something I’ve also been trying to describe for years: the pressures of contemporary academic existence. When I talk about it I usually start by laughing at how we continue to speak of the tension between research and teaching while our daily practice increasingly has nothing to do with either, but instead involves us in middle management roles that come sometimes under the heading of “faculty governance” and sometimes under the heading of “accountability” and mostly in any case just involve writing emails, and answering emails, and filling out surveys, and building websites, and making Excel files, and checking other people’s Excel files, and attending meetings from which we emerge with more emails to write. My personal approach to email has become almost entirely whack-a-mole. If I see it when I have a minute I’ll bang off an answer, but if it slips away it might as well be gone forever, since every time I sit down at my screen there are 30 more waiting to be dealt with. While I wouldn’t go so far as to describe my situation with the words “a punishing intensification of work,” or “a profession overloaded to breaking point,” I know what Gill is talking about. I rarely read any more, let alone think.

Where Gill is particularly good is in the sense she provides of our acceptance of the new normal, acceptance and collaboration. We recognize that we’re overworked, sure, but we don’t question the sources of the pressure, for instance the bizarrely augmented demand for accountability (or what they call in Britain “audit culture”) which, there as here, was “once treated with scepticism,” but “has now been almost perfectly internalized.” Nor do we raise questions about whether the “’freedom’, ‘flexibility’ and ‘autonomy’ of [the academic job] has proved far more effective for extracting ‘surplus value’ or at least vastly more time spent working, than any older modalities of power.” These are just two of several directions that could be followed up if one wanted seriously to consider how and why we work ourselves to the bone, putting up so little resistance to these new demands.

There are a couple of other things Gill doesn’t mention that might augment an account of why we don’t resist. One, not just an academic problem, is the proliferation of distractions, for instance what are technically known as “stupid games” (on which subject see this excellent article). Academics don’t play Angry Birds more than anyone else, but they do play, and they play for the same reasons the rest of the world plays: not to avoid work, but to avoid the guilt that would otherwise fill the hours in which they find themselves unable to work, guilt which, if indulged in, reflected on, and criticized, might lead eventually to a desire to change our conditions and those of others. Distraction can’t help but dampen resistance.

Also relevant to the question of collaboration is Gill’s discussion of how peer reviewing has become so much less civil in recent years. “When,” she wonders, “did it become acceptable to write of a colleague’s work ‘this is self-indulgent crap’ or ‘put this manuscript in a drawer and don’t ever bother to come back to it’ — both comments I have read in the last year on colleagues’ work.” She suggests two analyses: “repressed rage bursting out as an attack against someone who is not the cause of it” and “[peer reviewing] as one of the few sites where academics may feel that they can exercise some power — thus they ‘let rip,’ occasionally cruelly, under the cloak of guaranteed anonymity.”

It’s related to the question of collaboration because for sure we’re never going to gang up on the masters if we keep tearing away at each other. But it’s still unclear why we’ve taken to doing so. I’m going to take a stab at this in the next post.

RIP Maurice Sendak

Sometimes my class in Children’s Literature goes really well, and sometimes not so well. When it works, it’s because the students are interested in the philosophical issues raised by the texts, and willing to push my analyses in new directions. When it doesn’t, it’s because the students resist philosophy. It’s hard to believe you could have a group of kids at an elite college signing up for a course in ChL and then taking an it’s-just-a-kids-book-so-stop-reading-fancy-ideas-into-it attitude, but if even a couple of them do, it damages the whole class. One time, I remember, I gave a complex analysis of something and looked up at blank hostility — and then one of them put up her hand and said, “it was really sad when the dog died.” This became my watchword for a bad class experience. It was really sad when the dog died.

Now I’m watching out-takes of Stephen Colbert’s conversation with Maurice Sendak, and Sendak is saying that when Jennie, the dog in Higglety Pigglety Pop!, joins the World Mother Goose Theatre Company she’s actually dead. And I’m thinking: Jennie dies? My adult reading of this text was coloured by my childhood reading, and I never knew. This is devastating. In short, I am really really sad that the dog died.

Students speak

The two student speeches at graduation yesterday were structured loosely around the usual themes:  how nice life had been at college, how scary it was to be leaving, and how the members of the graduating class should forge ahead and make their mark on the world.  Given the content of the speeches, though, it is entirely mysterious how this mark is to be made, as for all one could gather they might have spent the last four years at summer camp.  Memories of drinking featured prominently, as did sex:  the first speaker mentioned her bikini wax in the course of a list of “firsts,” and the second speaker opened with a joke about how he too was going to speak of her bikini wax, with apologies to her boyfriend.  (Likely it was this that prompted Eila’s third grade teacher, who had attended the ceremony to see her student helper graduate, to ask me whether I found the speeches “inappropriate.”  The word vulgar is no longer in common use.)  Neither of them said anything about politics;  neither took a stand in any way on any topic whatever.  And neither mentioned a class, or a professor, or a book, or an idea.

Part of what accounts for this might be the college’s focus on what we used to call extra-curricular education and now call co-curricular education.  One could imagine two reasons for changing the term from extra- to co-.  The first would be to emphasise the interplay between the two, such that ideas from the classroom were discussed and tested on the playing fields, in public debates and lectures, and over beer — and vice versa, with the things students were thinking and experiencing outside the classroom brought up in seminars where they might be reflected on and challenged.  But I think the actual reason is the second:  simply to imply that what students do outside the class is of equal importance to what they do inside.  Which is a short step to more important, or all-important, and in any case severs what should be a meaningful tie.

As I listened to the students speak, I cast my eyes over the list of prize winners at the back of the programme.  It was a heartening list.  I know these students.  Many of them have taken my classes.  They are fabulous, smart people.  All of them, any one of them, would have given a very different kind of speech.  So how are the valedictorians chosen?  And could we change the way we choose them?

Words and other languages

A few weeks ago my class had an extensive discussion of the “slutwalk,” in which female students put on provocative clothing (or whatever clothing they like) and parade the campus in order radically to challenge the idea that anyone, however she dresses, is ever “asking for it.”  I had a few thoughts in the course of the discussion, and here is one of them.

My students tend to believe that there are codes inscribed in facial expression, bodily gestures, and clothing — that these form a discourse, beyond words, one that we use to communicate, one that must be understood within a given cultural frame.  And yet they also believe that they may, if they so desire, mute this discourse, un-speak and un-hear it, such that one would no longer be expressing with the body and the face and the clothing, such that not even one’s tone of voice would count, but only words:  no means no, however you say it, and whatever gestures accompany it.

I am interested in this resurgence of the logos, this notion that the word, flat and dead, without accompaniment, without ornament or subtext, and above all disembodied, is the top dog of communication.  It seems obvious to so many people, but to me it seems only legally obvious.  By this I mean that in the kind of legal cases that prompted activities like the slutwalk, it was necessary to draw a line between operative signals and inoperative signals, and the only place where such a line can be drawn with clarity — and thus the correct place to draw it — is between words and everything else.  But leaving aside the legality and speaking philosophically, the decision to draw the line there seems arbitrary.