Archive for ‘self-irony’

August 7, 2020

Why I won’t be using Zoom

by Carl Dyke

There’s a lot of personal detail in this post. I think it’s necessary, and also in my case pretty funny because I’m empowered to shield myself from the unfunny bits. But if you can’t be bothered I don’t blame you. The tl;dr is that for me and some other people, I reckon, the experience and performance of self is awkward in ways that make personal imaging technologies existentially confusing, disruptive, or even threatening. I don’t think I’m saying anything new here. But as we head into a technology-mediated school term because of the pandemic, I have this to add to the lore of video course delivery and the confounding diversity of human kinds.

My Grandma Liz famously disliked being photographed. This seemed odd to everyone else, because by many standards she was a beautiful young woman and a handsome older lady, with strong features and an intelligent gaze. The standard garbage folk diagnosis was vanity, but her frank discomfort with her own image ruled that out. Some of us chalked it up to the free-floating poisonous critical judgment that can emerge from the family talent for observation and fine discrimination. And certainly it becomes swiftly wearing for a smart, ambitious woman to be constantly reminded that for others she’s little more than a pretty face and a fine rack of lady parts. In any case this was nowhere near the only way Grandma was odd, as are we all, so we all got on with it. I don’t have Grandma’s figure (it’s probably for the best), but I used to get along pretty well with her and I’ve gradually come to believe we had something more permanent in common.

At some point when I was a kid, I remember being given to understand that Dad was concerned I might be showing signs of self-absorption. This was a pretty serious party foul in Dad-world so I installed it as a priority hypothesis to test in a life cobbled together out of experiments. I think the irony must have been lost on me at the time. There was plenty of evidence – I was pretty fascinated with mirrors, or really reflective surfaces of any kind. I looked at myself any chance I got, from every angle I could. Store and car windows were magnetic, personal video selfies before personal video selfies. It probably wasn’t quite obsessive.

Fortunately I was not self-absorbed, at least in the sense of vanity. The issue was not connection but disconnection. I was fascinated with the image because it was obviously ‘me’, but I couldn’t figure out how to get that to make sense. Every time I looked, every step and angle, this uncanny something or other I couldn’t find any way to identify with moved right along with me. In middle school I took the sewing version of home ec and for a few years after that I would buy thrift store shirts and custom tailor them for myself. Badly, which I knew at the time, but it wasn’t really the craft I was concerned with. I have no idea what I thought I was doing at the time, but I was trying, I now think, to get what I looked like to have anything at all to do with how I experienced myself.

Yes, I had a fedora phase. And this:

Felix the Superbeetle and cousin Lindsay

One of my girlfriends in college remarked that when she saw me walking across campus, “it” looked good to her. I was delighted! Yes, nailed it!

Was “it” like that for Grandma too? I have no object permanence to myself. I don’t fear death, because how would it be different? To this day, when I see my reflection in a mirror, in a photograph, or on video, my first reaction is “what the hell is that.” Every. Time. From one moment to the next, I have no damn idea what I look like. Obviously I get queer, and for what it’s worth I count normal as a genre of queer. I get the horror of being pinned into any of the categorical identities, and the further horror of having to inhabit them in self defense. I can really understand why some people automate their self-presentation with stereotyped hair and wardrobe constructs, and I’m sympathetic with the chaos that must break back into their lives when that presentation is disrupted. But when they expect it of me as well I draw the line. It’s not that I want to fight that battle, but I don’t want to live it either. I is the kaleidoscope you see (I guess?), for better or worse.

All of this is stuff I’ve long since learned to manage, or at least live with. The Carl-bot is a practiced performance in many settings, and lets me peek out around the edges of ritual and expectation to express my care in the ways I care to express my care. But the bottom line is that having or making an appearance is an active and chaotic and distracting process for me. It’s work, and adds to the multi-tasking burden of all the other chaotic feeds I’m getting from environments full of other critters like and unlike me commanding my attention in various ways.

Seeing that work reflected back at me in realtime is mesmerizing and awful. Thinking about it happening on all of the other screens is an infinite regress of confounding self-reflection. I know I can turn off my video. I’m not telling you a problem and I’m not interested in your solutions. I’m an adult, responsible, smart, and adaptable. I guess? What I’m saying is, this is why I won’t be using Zoom.

June 26, 2018

The very idea

by Carl Dyke

Last week, after a whole bunch of stalling, I went in to the local health care provider for my intake physical. We’ve been in our new location for three years with no primary care, but neither of us likes how the medical industrial complex works or how it works us, so we haven’t been eager to get ourselves reengaged with it. As usual I liked the new folks fine and everything went fine. That’s not what this post is about.

As part of the intake the screening nurse asked me a bunch of medical history kinds of questions. One of them was whether I’d ever had suicidal thoughts. Because I was in an honest question answering mode I said of course I have, routinely. This answer threatened to change the room and involve me in the kind of relationship to medicine I seek to avoid, so I spent the next couple tense minutes walking it back, until eventually I had never of course actually thought of actively taking my own life. Which, in a narrowly literal kind of way, is truthy enough and a workable compromise for all concerned.

The more robust truth is that to me suicide has always been an interesting idea. It seems obviously to be among the live options under certain circumstances, and therefore well worth being mindfully aware of in case those circumstances arise. To me, and this is what the post is about, the idea doesn’t become real until it’s called forth as a real live option under real live circumstances. Until then it’s just an interesting way of being aware of and in the world, a kind of inexpensive experiment, and a way of being alive to possibilities not immediately in play. So I’ve thought suicide all the way through, many times, without so far reaching the pragmatic threshold where it’s what I might want to do right now. Have I ever had suicidal thoughts? Of course I have. I’m a thinking person.

In general this is how ideas work for me. They are not, at all, where my reality is. The idea of suicide has no power to kill me, any more than a recipe for hummus is a delicious and nourishing snack.

This is pragmatism. It’s also Marx snarking at the idealists in The German Ideology:

Hitherto men have constantly made up for themselves false conceptions about themselves, about what they are and what they ought to be. They have arranged their relationships according to their ideas of God, of normal man, etc. The phantoms of their brains have got out of their hands. They, the creators, have bowed down before their creations. Let us liberate them from the chimeras, the ideas, dogmas, imaginary beings under the yoke of which they are pining away. Let us revolt against the rule of thoughts. Let us teach men, says one, to exchange these imaginations for thoughts which correspond to the essence of man; says the second, to take up a critical attitude to them; says the third, to knock them out of their heads; and — existing reality will collapse.

These innocent and childlike fancies are the kernel of the modern Young-Hegelian philosophy, which not only is received by the German public with horror and awe, but is announced by our philosophic heroes with the solemn consciousness of its cataclysmic dangerousness and criminal ruthlessness. ….

Once upon a time a valiant fellow had the idea that men were drowned in water only because they were possessed with the idea of gravity. If they were to knock this notion out of their heads, say by stating it to be a superstition, a religious concept, they would be sublimely proof against any danger from water. His whole life long he fought against the illusion of gravity, of whose harmful results all statistics brought him new and manifold evidence. This valiant fellow was the type of the new revolutionary philosophers in Germany.

Ha. So anyway, it is from this disposition that I react with dismay to people who, speaking with great moral conviction, hold that there is no reason to come to any kind of understanding with people who entertain and articulate certain kinds of dangerous, harmful ideas. There’s no such thing. This is just, literally, narrow-mindedness. But also, that’s an interesting idea to me – that ideas could be so important, so immediately real, that they need to be opposed in themselves, as such. I think the world must be a very different kind of place for people who experience ideas with such concreteness.

April 8, 2018

Politics? In MY classroom?

by razumov

(This untimely post is in honor of Chuck, who certainly must have thought about these things over the course of his life.)

Two things have happened to me recently. One, I got a tenure-track job at a university where the students have a professional and not just a personal interest in learning about Russian history. Two, I became politically active, to the extent that joining a socialist organization and doing stuff with them a few hours a week is considered active. It’s my second semester now and I’m teaching Intro to Russia Since 1825–and, of course, this being the revolutionary centennial school year, thinking about the eternal question of Politics In The Classroom.

As an undergrad, even a politically-opinionated one, my opinion on this topic was unequivocal. I did not want to hear about my dumb professors’ political views because I knew that these would amount either to the tepid NPR liberalism I got plenty of elsewhere or something noxiously right-wing that would be even worse. I had enough acrimonious debates with profs in seminars that I knew that a prof who had trouble concealing his (usually his) politics was also unlikely to argue for them in good faith. Instead my favorite classes were the ones that seemed to point to an escape from the political tractor beam of the late Bush era.

As a professor, I’m much less confident of all this than I used to be. First of all, of course, there’s no way to teach the history of Russia’s twentieth century without “classroom politics,” if nothing else because students come in with preconceived ideas shaped by a deeply political process. Even if it were possible, though, would it be desirable? As a socialist I want to help people understand the Soviet experience in the light of its real strengths and weaknesses, not through the kind of propaganda that still wins Pulitzers. As a scholar… I want the same thing. (I mean, duh. I wouldn’t have beliefs if I didn’t think they were true.)

Yet converting this growing comfort with classroom politics into actual teaching has been surprisingly hard. A lot of what I try to do in my lectures–the debunking aspect–involves my mental image of what students already believe. To my surprise, I’m consistently off in my evaluation of these beliefs. The whole class pretty much already understood that the Soviet Union’s role in WWII is consistently downplayed in US schools, for instance, and their opinion on the place of Jews in Imperial Russian and Soviet life (a Fiddler on the Roof narrative I’d thought was fairly widely shared) was in fact pretty much nonexistent. Half the time I must be confusing them awfully, the poor things, as I shadowbox with an opponent not relevant for American students since the 80s. (Maybe next year I’ll do a writing exercise at the beginning of the semester where I ask them to present their priors and then at the end to revisit them.)

The flipside of this is that I’m finding that my interventions make little difference anyway. I assigned an article legendary in my field for marking a shift away from both the totalitarian and revisionist models of Stalinist individuality (Jochen Hellbeck’s “Fashioning the Stalinist Soul”), but my students felt no compunctions about fitting it into their familiar totalitarian view of Stalinist life. Hell, maybe they’re right.

At least, if nothing else, my rant about Nineteen Eighty-Four being the worst possible book for understanding the Soviet Union will stick. I hope.

January 6, 2013

A Question Haunts America

by johnmccreery

Us folks on the left are not the only ones who see the U.S.A. as going to hell in a hand basket. My title is taken from the first line of an article in The National Interest by conservative pundit Robert W. Merry titled Spengler’s Ominous Prophecy. Oswald Spengler that is, the author of Der Untergang des Abendlandes (The Decline of the West), now as rarely mentioned on the left as Gramsci is on the right, a mention-and-skip author in liberal higher education with its focus on the Enlightenment and progress toward the great kumbaya of universal rationality. Perhaps he deserves a second look.

He did, after all, anticipate the world wars and the “surge of imperial fervor and a flight toward Caesarism” that seems to afflict all civilizations when their roots in naive but authentic culture give way to “the domain of a few rich and powerful “world-cities,” which twist and distort the concepts of old and replace them with cynicism, cosmopolitanism, irony and a money culture” (the quotes are from Merry, not Spengler himself).

The idea that civilizations are organic wholes and develop through cycles from birth and flourishing to maturity, decay and death is no longer fashionable. But the notion that there is no universal humanity, only human beings, born incomplete animals, who become what their cultures/civilizations encourage and demand that they do is, albeit debatable, Anthropology 101.

Perhaps it is my years, rushing all too soon toward three score and ten (now just a year and a bit away) that turn my thoughts in this direction. But could it be that left and right, we all need both Marx and Spengler to see the world whole through disillusioned eyes?

February 14, 2012

Aggregate, Arrange, Assemble

by Carl Dyke

Today I had an ambitious day. I described paper writing to my intro World History sections as a process of aggregation, arrangement and assembly similar to the formation of stars as they collect atoms, compact them to fusion and burst forth in light. Then I told them about the episode of “Trailer Park Boys” in which Ricky breaks into a house to pick out an engagement ring for Lucy (aggregation = research), swallows the ring so he won’t get caught by the cops and throws it up again once they’re gone (arrangement = analysis), then hands it to Lucy and says “So, you want to get married or something?” (assembly = writing).

We talked about what’s wrong with stealing the ring (this would be the ‘plagiarized’ paper) and whether making Ricky a Viking who ‘plundered’ rather than ‘stole’ it made a difference. We considered why Lucy might have preferred a more ritualized arrangement of their eventual assembly, concluding that in this case the value of ritual lay at least in part in its enactment of focused competence and commitment in making arrangement for the assembled couple’s needs. It’s about credibility. We all agreed that the same ring might be stolen, plundered, bought or fabricated, transported in one’s guts or a velvet box, delivered via slingshot or placement in a glass of champagne, with each permutation of aggregation, arrangement and assembly making a significant difference in the meaning and value of ‘the same’ ring.

I took out some nice artisan multigrain bread I had aggregated to myself earlier and ate some. We talked about the process of chewing and digestion whereby the previous arrangement of the bread is broken down, rearranged into more directly nourishing compounds and waste, and ultimately reassembled into poo and me. We laughed a bit about making sure that these two assemblages not become mixed, and considered the consequences of substituting Skittles for bread in one’s regular diet. We talked about the paper that would result from just vomiting the bread back up or pooping it out without nutritional processing.

They may not immediately have digested all this, but they were intrigued and I had lots of fun.

January 30, 2012

Word to your Mama

by Carl Dyke

I had a little fun with my scifi reading circle last week. They were pretty cranky about Gibson’s Neuromancer (although they picked it), which wasn’t giving them a nice clean linear narrative or conventionally identifiable / likeable characters. I told them it was all about getting cool with the unfamiliar, a slow difficult process in contrast for example to dating, boinking and marrying the woman who reminds you most of your mother. (It was boys doing the most vocal kvetching.) They were stricken.

[Update: It occurs to me that in a roundabout way this is one answer to Tim Burke’s question in his current post about why we think critical thinking should be work, not fun, or why we are suspicious of people seemingly just having fun.]

August 2, 2011

Steering and the ruts

by Carl Dyke

“He told me years later that serving the church in Oxford reminded him of driving an old Model T Ford on a muddy country road; the steering column had so much play in it that turning the wheel didn’t do much good and the car just followed the ruts anyway.”

Tim Tyson, Blood Done Sign My Name

April 18, 2011


by Carl Dyke

is the name of a blog described as “a guide to living with your philosopher.” The current post addresses philosophers’ constant questioning of every little thing, diagnoses this as an occupational hazard, suggests being complimented that they find you worth taking seriously, and recommends a ‘safe sentence’ to use when you’re just not into being prodded about your premises and commitments.

Philosophers are every bit as weird as any Papua New Guinean highlander, Kalahari bushman or stock analyst, so this ethnographic site is both inherently interesting and potentially critical to maintaining cordial relations with the tribe in question.

January 20, 2011

How many times must I tell you?

by Carl Dyke

I noticed myself doing something interesting today. On Tuesdays and Thursdays I teach three sections of introductory World History back to back to back. We were doing a document analysis using my critical reading rubric. Inevitably I end up providing some of the same guidance from section to section, so that by the third section, from my perspective I’m saying the same thing for the third time.

I share a common prejudice that people who need things repeated to them three times might not be all that bright. (Actually, since I had the students divided up into smaller work groups among which I circulated, I said some of the same things way more than three times.) I know there can be reasons repetition might be needed that have nothing to do with intelligence, so I can usually intercept my first reflex reaction. But the point here, of course, is that I was not repeating myself to the same people; it just felt that way by the end of a long day. And as a result I noticed myself reflexively feeling as if the third section might be a little dim – when in fact they picked up the task and performed it every bit as well as the earlier sections.

It’s interesting to think what kinds of effects might accumulate over a long semester, or career, of letting this dynamic play out. Just a little more impatience in my body language, a little less care in explaining the ‘third’ time, or conversely the kind of elaborate patient overexplanation one may lavish on the slow. How much difference do such subtleties actually make?

December 6, 2010

Monologue tolerance

by Carl Dyke

As you may know, Bob, I was trained in one of the smaller and more obscure subdisciplines, a little thing we like to call ‘Intellectual History’ (or sometimes ‘intellectual and cultural history’ if we’re aware, however dimly, that people other than official intellectuals have an intellectual history). Even in the high academy we’re pretty ornamental and there aren’t usually a lot of us around. So it’s been a blessing of sorts for me to live and work just near enough to the Raleigh/Durham node of big research universities to be able to attend the meetings of the Triangle Intellectual History Seminar.

The seminar often brings in bigwigs to talk about their work in progress, and also offers a forum for members and their advanced graduate students. The level is high and the distribution of expertises is broader than someone outside our little field might think possible. In general the room is packed with very smart people who know a lot of stuff, so in principle it ought to be a thoroughly stimulating experience – you know, like a conference. And even better than most conferences, papers are distributed beforehand and we’re all there intentionally, so everyone arrives prepared on the topic of the day and there’s no need for the slow death of droning paper delivery.

In practice of course there’s a little of that droning, by way of introduction, but it’s mercifully brief and usually offered with some ad libs to keep it fresh. But by academic standards we get down to discussion remarkably quickly, and here is the perfect opportunity for the exciting exchange of ideas that we all imagined academe to be!, before graduate seminars, freshman surveys, and committee meetings blew our brains out like egg yolks. Except that even here, where conditions are seemingly ideal, that exciting exchange does not take place.

Why? Well, there are just some logistical issues when you’ve got 15-20 smart people who all have things to say and can’t say them at once. Can’t have the loud and the quick dominating the discussion, so everyone gets a turn. Time is limited so followups have to be moderated and tangents discouraged. And although everyone likes a good joke, we wouldn’t want to short the presenter on the serious discussion about her important work that she deserves.

The result of these reasonable considerations is that nothing resembling conversation actually takes place. Because she knows she’ll get one shot to say what’s on her mind and then the turn will pass to someone else with their own fish to fry, each speaker produces a well-crafted monologue so dense with premises and implications that the presenter can only respond to a fraction of it, of course with another monologue. And of course all exchanges radiate from the node of the presenter, with no direct interactions between the other participants. It’s all very orderly, lots of smart stuff gets said, it’s productive, certainly worthwhile, even beautiful in its way; and there’s no transformative effervescence, no spark, virtually no chance of the happy accidental flashes of insight that come from free-flowing conversation, improvisation, riffing call and response, theme and variation, the jazz of the mind.

I said there was no conversation, but that’s not quite right. There is, but it’s on a very slow and ponderous (in the sense of pondering) rhythm. As I sit in that room aching for something a little more upbeat, it occurs to me that success in the high academy is in part a function of tolerance for monologues, both delivering and receiving: relatively short ones like those in the room, longer ones like lectures and journal articles, really long ones like books. For ordinary mortals this kind of monologic sensibility is just plain rude, but for the beasts of academe it’s the measure of seriousness. We discipline our young to patience for the monologues of others, and patience for the development of their own; and tsktsk at the minds both bright and dull who won’t or can’t adapt to the deliberate pace of our conversations. No wonder serious academics are leery of bloggery.

Which brings me to my last point. The paper last night was by Lloyd Kramer, a very good historian who was engaged in it in a conversation about the right way to do history with his graduate advisors, now very old, and R.R. Palmer, now dead. There was a bit of a recovery of Palmer, an old-school big-picture synthesizer, as against the more fragmented, conflicted history derived from post-structuralism that followed. This is a conversation in which the monologues are at the scale of oeuvres and generations, or rather in which it is only at that scale that the apparent monologues resolve into utterances in a very ponderous conversation indeed. In the course of the ‘discussion’ Lloyd mentioned that one difference between these generations had to do with their understanding of selves and identities: as primordial and singular for Palmer, as dialogically constructed and plural for the post-structuralists. Here I wanted to say that it didn’t take post-structuralism to see self and identity this way, since the insight was there already in Hume, Hegel, Nietzsche, James, Mead and DuBois to name a few. But I held my tongue, and thought about what kind of selves are constructed out of dialogues that take hours, years, lifetimes and generations to unfold.

June 28, 2010

Are teachers like coaches?

by Carl Dyke

Well, for one thing in high school lots of teachers are coaches. But I’m going to focus on coaches of big famous sports teams. There are some illuminating similarities, and the differences have a laboratory feel to them for thinking about how both teaching and coaching work and don’t work. I’ve been intrigued by John Doyle’s series of posts at Ktismatics questioning whether teachers actually cause students to learn, based on an extensive survey of studies that pretty consistently show they don’t. We could ask the same questions of coaches and winning.

To set the scene, John finds the data pointing strongly toward genetic (or at least early-childhood) hardwired dispositions to educational performance. In contrast, study after study has failed to find much impact on student outcomes from different teaching or learning styles, experience levels, specialized training, or any other teacher variable. Generously, John’s conclusion in the most recent post, “The Students Make the Teacher,” is that “kids would spool out their genetic intellectual potentials within the constraints imposed by their culture regardless of who their teachers are, but that’s not to say that they need no teaching. Rather, as long as they’re not abusive or neglectful, teachers are probably pretty much interchangeable over the long run. So my bet is that regardless of what sorts of educational outcomes are measured, differences between teachers will prove minimal.” In short, students are going to learn what they’re going to learn almost no matter what.

Of course like most teachers I’d like to take credit for all those Aha! moments that happen in and around my classroom, and I’d like to blame the kids who don’t get it for being recalcitrant. But I’ve long suspected that neither position is well-warranted, not to mention that they’re transparently ideological, so I’m open to John’s suggestion to “be a good enough teacher, rather than one who’s too caught up in performance anxiety and delusions of massive impact on kids’ lives. Enjoy the job, recognizing that ultimately it’s the kids’ job to develop and to learn. Then relax, have some fun, honor the kids’ autonomy, let your own personal style shine forward, and the teacher and the kids might actually enjoy the ride together.”

So what about coaching? John says students bring scholastic performance with them and teaching has little to do with it. A parallel argument would be that athletes bring competitive performance with them and coaching has little to do with it. If this were true, a coach with good players would look brilliant, while the same coach with bad players would look like a dog. And in fact this seems to be the case. In the NBA, for example, Doc Rivers had moderate success with a moderately-talented lineup in Orlando before being fired for stagnant performance. Subsequently the Magic drafted Dwight Howard, signed Rashard Lewis and traded for Vince Carter, becoming one of the dominant teams in the East under journeyman coach Stan Van Gundy. Meanwhile, Rivers won an NBA championship coaching the Boston Celtics, who added Kevin Garnett and Ray Allen to an already-strong roster of role players led by star Paul Pierce.

The acknowledged superstar of NBA coaching is Phil Jackson, who won multiple championships with the Chicago Bulls following the maturation of Michael Jordan and acquisition of Scottie Pippen. He then went to the Lakers where he won with Kobe Bryant and Shaquille O’Neal, did not win following the departure of O’Neal, then won again with the arrival of Pau Gasol. Clearly his success is player-dependent, but it should be said that his chief merit is that he puts his players in position to succeed; he is a shrewd evaluator of talent and disposition, as witness his ability to get full value out of brilliant but mercurial prima donnas Dennis Rodman and Ron Artest, not to mention Jordan, Pippen, Bryant and O’Neal themselves. I think this is characteristic of both good coaches and good teachers, and it’s not a small thing; teams of superstars without this sort of enabling coordination regularly implode, as witness France in this year’s soccer World Cup.

In NFL football, Bill Belichick is an excellent example of the hypothesis. He was a total dog with the talent-poor Cleveland Browns, then became a genius with the talent-rich Patriots. His excellence as a game-planner did not change, but it was not enough without Tom Brady and Randy Moss in their primes running the plays. Again, Belichick is a shrewd talent evaluator who identifies his players’ strengths and puts them in position to succeed, but without those strengths, as more recently with the injury and decline of Brady, Moss, Wes Welker and other core players, he is helpless to be the difference that makes the difference. Similarly, Mike Holmgren understood the connection of personnel to coaching well enough to insist on controlling both in Seattle. Unfortunately he turned out to be a mediocre judge of talent (see: Branch, Burleson) and was not able to repeat the Super Bowl success he enjoyed in Green Bay with a team assembled by general manager Ron Wolf.

In college sports it is widely known that the best coaches are first and foremost the best recruiters. All else being equal, which it usually is, the best players win. Coaches who can both obtain those players and put them in positions to succeed are of course at a premium, and coordinated teams of good players regularly beat packs of feral superstars, but even here the coach’s merit is in identifying and channeling the existing talents and dispositions of her players. And given the rapid turnover of rosters in college sports, coaches who were geniuses with great players a few years ago are regularly has-beens looking for work when the talent level drops off.

It is also generally understood that over time players will begin to tune out even the most successful coaches. A great recent example of this is the NHL’s Peter Laviolette, a coach who specializes in increasing the intensity of underperforming or undertalented teams. After beginning his career by improving the talent-poor New York Islanders marginally he wore out his welcome and moved to the Carolina Hurricanes. There he lit a fire and got maximum effort out of a moderately-talented team, pushing them to a Stanley Cup. Within a couple of years his approach had burnt the players out, he went from genius to dog, and after a dreadful half-season he was fired. Whereupon he was hired this year by the talented but drifting Flyers and promptly became a genius again, driving them to a Finals appearance. If history holds true (many other coaches fit his description, for example Mike Keenan) he has maybe one more year before the players tune him out or rebel against the constant pressure. The teaching equivalent of Laviolette is Jaime Escalante, the “Stand and Deliver” guy. He was undeniably successful in activating the latent talents of his students, but the pressurized environment he created proved unsustainable.

In this year’s World Cup the Italian coach, Marcello Lippi, was clearly a dog as his talented team, the defending champions, failed to win even one game against lesser opposition and were eliminated in the first round. Yet Lippi had been the coach for the World Cup win four years earlier, just as clearly a genius with an unparalleled record of success. “He was named the world’s best football manager by the International Federation of Football History and Statistics (IFFHS) both in 1996 and 1998, and world’s best National coach in 2006. He is the only coach in the world to have ever won the most prestigious competitions both for clubs and for National teams. In 2007 the Times put his name on the list of top 50 managers of all time.” He will shortly be replaced as coach and it’s likely the team will perform better, but will that be because the old guy was bad and the new guy is good?

If the coaching/teaching analogy holds, all of this ought to be quite humbling for all of us would-be Svengalis. Our upside is limited by that of our Trilbys, and our downside is as far down as they care to take us. When the chemistry comes together we can sometimes be catalytic, but this can’t be counted on as the normal situation and often enough a good chemistry requires our removal. Under these circumstances I can certainly understand why we’re paid so little, as we often complain, despite performing what is magically thought of as socially necessary labor. Fortunately the learning that really needs to happen will happen anyway, and maybe along the way we can “relax, have some fun, honor the kids’ autonomy, let [our] own personal style shine forward, and … enjoy the ride together.”

May 7, 2010

An obsessive consistency

by Carl Dyke

I suppose most good teachers wonder if they’re reading and grading students’ work consistently and fairly. Because I allow students to rewrite the first papers of the term (and all failing papers), I have a kind of opportunity to check myself on that. Many rewrites are perfunctory or spotty, so I’m regularly rereading the same stuff I read the first time. With seventy or so papers to read at once I certainly don’t remember each one, so functionally I’m reading it anew. And although I require the original version to be attached to the rewrite, I do not automatically check it over before I start reading the new version.

I’m working through a stack of rewritten papers right now, and just had a not infrequent Aha moment: I wrote a comment in the margin of one then, curious, checked the original. Beside the exact same sentence there was the exact same comment, in the exact same wording. I’m feeling pretty consistent right now.

This invites the question of whether I’m just consistently biased, which I will admit is true. I am biased toward what I consider ‘good’ papers, and I have embedded those biases in explicit assessment criteria in the syllabus and grading rubric. It also, more disturbingly, raises the question of whether I am wasting my and the students’ time commenting on the papers, or commenting in the way I do. In the sense that the students feel attended to and accept the legitimacy of the grade, I think the comments do their job. But I’d like them to be guides to better performance, which in these cases has clearly not paid off. In other cases it does, as I can also see from the rewrites; and where it doesn’t, it seems to be with students who are used to lots of grammatical red ink and baffled by questions about the actual content of their essays. So there the comments may not pay off directly, but they contribute in a small way to developing a new habit of mind about the communicative functions of writing. Or so I’d like to think.

February 8, 2010

Relative (in)competence

by Carl Dyke

I spent much of the winter break re-tiling the kitchen floor. It was a moment in a general experience I have as a home do-it-yourselfer, the ‘man of average mechanical ability’. Each new thing that needs doing is something I’ve never done before. I have to learn principles and techniques from scratch, take way too long because I’m unsure of myself, and still make all the newbie mistakes. By the time I’ve internalized the rules of the job and started to get a feel for its practice, I’m done and probably won’t be doing it again soon enough for the acquisition to stick.

Rachel and I first tiled the floor back when we were friends before we hooked up. She had done some tiling at a resort she worked at in Maine and at her Mom’s, so she had a pretty good idea what to do. As a result, we did it almost right. It turns out that tile is one of those things where almost is importantly not good enough.

Tile is very hard and durable, but it has almost no flex. This means it will work itself loose and/or break where a more resilient material would absorb and dissipate force. Therefore it really matters to get the underlayment smooth, firm and level. We knew this, so we popped for the special tile underlayment panels and screwed them firmly to the subflooring. The underlayment is basically a thin, hard sheetrock with a mesh matrix. It turns out to be a little tricky to get the screws all the way set into it. It’s a bit more than a power screwdriver can handle, and forearms and backs start to cramp up after muscling dozens of screws. Like so many things about such jobs, if you did this for a living you’d work it out, but we don’t and didn’t.

We figured a slight screwhead protrusion here and there would not be a problem because they’d be small and buffered by the mortar. That turns out not to be reliably true. Over time any play there is in the floor (and there’s always at least some if your house is made out of wood) combined with traffic impact from the top uses those screwheads as fulcra to crack the tile, or failing that to wiggle and then seesaw it back and forth until it dislodges the grout and then the whole tile comes loose.

Spreading the mortar evenly is also a must. Any place where the mortar is thick or thin invites eventual problems. Still, we might have gotten away with our screwheads if we’d gotten the polyblend additive for the mortar so it had some give rather than going right to crumble under stress. Thicker or smaller tile might also have compensated a bit. We used 12″x12″x.25″ tile, which was thin enough to break easily and big enough to offer a lot of off-center leverage on each footstrike.

Knowing what I know now I’ll also be more attentive to the condition of the grout. When the tiles started to play it showed up in the crumbling grout first. It may have been possible at that point to scrape out the grout, inject some mortar under the tile edges, and regrout. Until it was too late that seemed like a lot of bother over a little cosmetic imperfection, but read on. [UPDATE: This technique did not work for me and I’ll be taking out and replacing a few more tiles over the summer. Seems to suggest it’s just worth biting the bullet and re-doing everything that’s even slightly loose all at once.]

OK, so eventually we had five or six tiles that were definitely coming up. So we took them out, walked around with attentive feet for a few weeks and ended up removing another dozen (out of maybe 120 or so total; like I said we did it almost right) that were showing signs of wiggle. At this point the real fun started.

Chipping up mortar by hand is an unpleasant task. I got a mortar chisel (big wide blade) which helped some, but some of the mortar wants to stay put no matter what. That’s what it’s for, after all. Plus kneeling on the floor pounding on a chisel with a hammer is not a recommended workout for thighs, hips and lower back. I did find, as is so often the case, that relaxing and letting the tool do the work was better than trying to muscle it. But for me at least that’s easier said than done. The bigger problem was that the adjacent pounding loosened up a couple more tiles. Eventually out of frustration I discovered that cutting cross-grooves in the mortar with a carbide-tipped scoring tool and then scraping it out with a grout scraper (I got the kind with the triangular carbide tip) was faster and less counterproductively violent. [Update: I’ve now got an oscillating tool with a chipping blade that looks very promising should this task arise again.]

So at this point we’ve got some open spaces on our floor ready to accept tile. Of course I went back and torqued down the offending screwheads. Some of the previous tiles were already broken, some more broke in the process of getting the mortar off them (which is just as fun as getting it off the floor), and of course the Home Depot no longer stocks those exact tiles. Given bad alternatives of approximate color-matching, tracking down remaindered tile, pasting broken tiles back in or digging up the whole floor and starting over, we decided to see if we could turn an embarrassing repair of a failed installation into a triumph.

We settled on smashing up the used tiles and using them for mosaic. The original floor is a checkerboard of reddish and tannish tiles, so we mixed the colors in the mosaics to locally fractalize the larger pattern. Doing the mosaic was a matter of sitting there with lots of tile shards of various shapes and sizes and piecing them together like a puzzle. That part was kind of fun. Once we had the mosaic laid out we took the pieces back out one by one, mortared them up (we used a premix acrylic mortar at this point) and stuck them down. By now my hips, back and thighs were getting downright blasé about all the crabby work postures on the floor.

Grouting was no sweat in comparison, just more floor work. Over the last couple of weeks my feet, which are in full PTSD hypervigilance mode at this point, have found a couple more wiggly tile corners, so hoping I’d learned my lesson I promptly dug the grout out, pushed mortar under and grouted them back up.

Meatball tests the new floor

This is the main stretch

The main section from the other side

We like the subtle contrast of the grout and repainted the cabinets to match

Optimistically I think I now have a pretty good skill for the job — not just the brute instructions and techniques, but the logic and feel of them. Of course we have no plans for any other tiling in the foreseeable future. We’ve also gone that much farther toward turning our little suburban development starter house into something no one likely to buy such a thing will want to buy. We’ll cross that bridge when we come to it.

November 26, 2009

Entropy in the cul-de-sac

by Carl Dyke

I noticed this morning [yesterday, now] that the bathroom floor had collected enough schmutz to pass my action threshold. Leaves blanket our lawn and laundry blankets a corner of our bedroom. There are dishes in the sink and a bagful of student papers to read. The fish need feeding, the dog needs walking and the State taxes on one of our cars are due. Recycling was last night, and again in two weeks.

At moments like this I feel the grip of entropy most keenly. The little orderly systems of my life require the regular application of energy to keep from sliding down into chaos. Each time it’s worth it – the modest pleasures of a clean floor, a tidy lawn and an empty bag add up to a satisfying little life. Nevertheless, as I contemplate each outlay of attention and energy on doing that’s just going to need doing again, and again and again, the happy Sisyphus remains a tantalizing ideal.

In the classic The World of Goods: Towards an Anthropology of Consumption (1979), anthropologist Mary Douglas and economist Baron Isherwood argue that the periodicity of tasks is a primary marker of status. High-frequency, non-postponable entropic tasks describable as chores are the specialty of women, children, and servants. This is economically rational, they propose, in the way that any specialization is.

Thus, the division of labor between the sexes is set, the world over, by the best possible economic principles as follows: work frequencies tend to cluster into complementary role categories. These differentiate upward: the higher the status, the less periodicity constraints; the lower the status, the greater the periodicity constraints (86).

It follows that “[a]nyone with influence and status would be a fool to get encumbered with a high-frequency responsibility” (86-7).

No wonder I try to turn the entropic work in my life into rare and extraordinary events rather than daily habitual duties. The problem, I suppose, is that my sense of status does not match my class, as Weber might say. The classy thing to do would be to engage Central Americans to regulate my floor schmutz and tidy my lawn; start a grad program so there are intellectual strawberry-pickers around to grade my papers; and delegate the dishes and laundry to my wife. Too bad she’s an artist and has no more sense of vocation to keep the house up than I do. If only I had a real wife and not this impressive doer of awesome things! Maybe the two of us could marry someone else to do the chores for us? Or adopt a kid, an older one so someone else has already made the training investment. But, you know, kids these days….

October 16, 2009

Existential infinity

by Carl Dyke

I suspect that the ‘infinity standard’ is a dead, beaten and buried horse, but for my own amusement I have a ribbon to wrap it in. Consider this post collateral damage from a long commute alone with my thoughts during an NPR pledge drive.

To recap for convenience, in comments on the first post of the thread Kvond perceptively noted that “the Common Sense digestion of the guilt people feel for ‘not doing enough’ probably has very [little] to do with… an Infinity Standard. It probably has to do with letting specific people or models down that one feels they can’t live up to (not Infinite Models), and has to do with the prior, one might almost say, a priori establishment of subjectivity itself as a condition for guilt (at least in the West), a mechanism of storing up energies of self-infliction, much more locally organized and defined from any logic of infinity (real or imagined).”

I agreed that the subjective experience of an infinity standard was properly understood not as the product of a top-down logical argument from principles, but of a bottom-up accumulation of local obligations and their affective baggage. I think that’s how morals actually work; as Nietzsche, Wittgenstein and Bourdieu show in their various ways, systematic moral philosophies range from attempts to universalize local practices to reports on the fantasies of their authors. The feeling of infinity comes when the local claims on one’s moral action overload the buffer on one’s attention and energy, producing a paralyzing system crash. As I metaphorized it later in the thread, the resulting guilt effect is like “the shrapnel of moral artillery being fired by various competing communities tear[ing] into those of us with a sense of obligation to something larger than ourselves but no stable sense of what that might be.”

The key point is the locality of effective standards and obligations. Kvond reports feeling those local claims as dispiriting straitjackets. Seen this way, the abstraction of infinity offers a liberating expansion of possibility. For any of us who grew up in tight-knit families, small towns or other relatively insular communities this argument is immediately evocative. Over-regulation can be a problem (corresponding to the “dualism/received knowledge” positions in Perry’s cognitive/ethical development schema).

But abstract infinity is only abstractly liberating, just as Marx argued in “On the Jewish Question” that abstract liberty is only abstractly liberating. In practice, Durkheim said, one must be regulated by a moral system that offers definite guidelines and goals, otherwise ‘it’s all good’ and ‘it’s all bad’ become equally available and equally unavoidable as floating judgments (corresponding to the “multiplicity/subjective knowledge” positions in Perry). Goffman’s warning against the tyranny of diffuse aims is on point here: when it’s not clear what the standards are, it can’t be clear what counts as accomplishment and an infinity of judgment is enabled.

We’re probably alright as long as we remain focused on personal liberation from a specific set of restrictive local morals, because they remain regulative even in their negation. Infinity looks like possibility from this vantage. The harrowing moment comes when we decenter our own locality and fully enter a world of multiple other local moral systems and agendas, each with equally coherent and valid claims on our attention and effort. Here the over-regulation is not coming from narrowness, but from overwhelming saturation. The syndrome is not claustrophobia, but agoraphobia.

As Neddy Merrill put it recently in quite a different context,

if we follow the ‘do the most good’ thought wherever it leads, we end up having really robust obligations that don’t leave room for our projects and commitments, e.g. friendships, hobbies, and so on. Or, in another version, the ‘do the most good’ thought leaves us alienated or estranged from our projects because of the way it prompts us to think of their value from the impartial point of view.

This is the question in relation to the trivially narrow yuppie quandary of whether to give money to Harvard University, and already it’s oversaturated. If we open the discussion up to all the possible wrongs that could be addressed by all the possible rights, any particular course of action recommended by one compelling standard becomes not just hopelessly inadequate by the plurality of standards but actively pernicious by other compelling standards. There are a lot of goalposts, they’re all a-wiggle, and the holder may not be on our team.

Be the target, Charlie Brown.

As wonderful as the internet and the world of blogging are for increasing our interaction density and enabling liberation from narrow, constraining provincialisms of practice, thought and ethic, that very same decentering dynamic potentially exposes us to an overwhelming multiplicity of compelling claims on our attention and energy, and potential judgments of our practice. The internet is just the most richly interactive of many modern media that not only delocalize us but then relocalize us in a much larger, more kaleidoscopic field of effective standards and obligations. Closing off or artificially limiting this paralyzing legion of ‘trolls’ and ‘grey vampires’, as a number of bloggers have done recently, is certainly one coherent coping strategy, and could suggest a relativist or perhaps merely multiplicity/subjectivist position in Perry’s old cognitive/ethical schema.

Perry suggests instead that we move to what he called “commitment”: “An affirmation, choice, or decision … made in the awareness of relativism (distinct from commitments never questioned). Agency is experienced as within the individual with a fully internalized and coherent value structure.” Yes, I end up saying, there are many other good things one might do, but this is the one I’m doing. Or as Weber said in his famous speech on politics as a vocation,

it is immensely moving when a mature man [sic]… is aware of a responsibility for the consequences of his conduct and really feels such responsibility with heart and soul. He then acts by following an ethic of responsibility and somewhere he reaches the point where he says: ‘Here I stand; I can do no other’.

The trick, I guess, is to be open to other people’s projects and even their criticisms of one’s own, without getting diverted into the swamps of Shoulds and What Ifs. It’s an infinitely open question where to draw that line.

October 6, 2009

Infinity and the ‘total institution’

by Carl Dyke

The reference was tickling the edge of my brain so I tracked it down. OK, cool – here’s what I meant:

Each official goal lets loose a doctrine, with its own inquisitors and its own martyrs, and within institutions there seems to be no natural check on the license of easy interpretation that results. Every institution must not only make some effort to realize its official aims but must also be protected, somehow, from the tyranny of a diffuse pursuit of them, lest the exercise of authority be turned into a witch hunt. — Erving Goffman, Asylums (1961)

The temptation is to look at this and say, Yeesh! Those dang institutions. Goffman’s more subtle point is always that these are things we do to ourselves.

August 26, 2009


by Carl Dyke

I just read Bruno Latour’s short essay “Why Has Critique Run Out of Steam?” It is a critical defense of facts against critique, motivated by Latour’s observation that the waste-laying weaponry of deconstruction has fallen into the hands of its enemies, who use it to cast doubt on global warming and to construct elaborate conspiracy theories about the CIA and Mossad’s connivance in the bombing of the World Trade Towers. “There is no sure ground even for criticism. Is this not what criticism intended to say: that there is no sure ground anyway? But what does it mean, when this lack of sure ground is taken out from us by the worst possible fellows as an argument against things we cherished?”

Latour worries that critical intellectuals are fighting the last war, that their aim is bad. Exposing the enemy misses the target when everyone is already busy running around pulling masks off and pants down. If the bad guys’ certainties are unwarranted, what about ours?

In which case the danger would no longer be coming from an excessive confidence in ideological arguments posturing as matters of fact–as we have learned to combat so efficiently in the past–but from an excessive distrust of good matters of fact disguised as bad ideological biases! While we spent years trying to detect the real prejudices hidden behind the appearance of objective statements, do we have now to reveal the real objective and incontrovertible facts hidden behind the illusion of prejudices?


This is of a piece with Latour’s more extensive (and acerbic) dismissal of postmodernism in We Have Never Been Modern, but somehow this one triggered a different association for me. It’s been a long time since I read it, but isn’t this some part of Allan Bloom’s argument in The Closing of the American Mind? As I recall, it’s not that Bloom didn’t see the value of the marxian and nietzschean critical ordnance that enables the demolition of the eternal verities, but that he thought they were too powerful. In unskilled or inimical hands they leave nothing but scorched and salted earth, or at least fool kids trampling his lawn and having sex in his bushes.

I’m no more comfortable now with philosopher kings locking away the most powerful engines of human intellection than I was in grad school when I read Bloom. But from Dostoevsky to Bloom to Latour smart people keep making good points about what happens when you let everyone play with dynamite. All else being equal I certainly do prefer good sense to scorched earth. But what exactly is at stake? Wouldn’t it be just typical for intellectuals to overestimate the importance of ideas in the world?

April 15, 2009

Philosophy is an excellent thing

by Carl Dyke

Over at Edge of the West, in the context of one of the usual pseudo-discussions about what philosophy is good for (prompted by yet another of Leiter’s snarky shills for the discipline, apparently), a guy named Michael Turner just posted a long, fascinating comment explaining how he went from software engineering to (Japanese) technical translation to language philosophy; in the course of which he said this:

OK, so I’m interested in what meaning is, and how meaning happens, through language. Can you philosophers help me out? Which one of you do I trust? Which ones are, by contrast, measuring their value to the field only by citation index, which might only be an indication of how many stupid arguments they’ve been able to start by feverishly propagating misunderstandings?

This is far from the most interesting thing he said (John M. and Evan, this is our kind of guy), and of course it leaves out all the genuinely valuable things the philosophers we all know we can trust do, but I still had a good snort over it.

In another comment, Anderson kindly offers up this provocative quote from Callicles’ rant in the Gorgias:

Philosophy, as a part of education, is an excellent thing, and there is no disgrace to a man while he is young in pursuing such a study; but when he is more advanced in years, the thing becomes ridiculous, and I feel towards philosophers as I do towards those who lisp and imitate children.

One might say the same of the study of history, or any of the humanities.

December 23, 2008

Wanted: Prof Whisperer

by Carl Dyke

A couple of remarks by Profacero here and olderwoman at scatterplot are coming together in my head with many such from over the years, to the effect that establishing authority in the classroom is a different challenge for women, race/ethnic minorities, and other stigmatized groups than for white men.

This is now an orthodoxy in the liberal academy, so like all orthodoxies I’m going to try to trouble it here. But it’s also true. It’s undeniable that since Columbus us white boys enjoy an entry privilege as authority figures, especially if we’re ruggedly handsome, brilliant, charismatic and naturally great-smelling like me. A big chunk of this is visually inherent as a function of habits of symbolic ranking and emotional identification. It’s also undeniable that for some fractions of our audiences only white men will do as authority figures, as the underbelly of this last election showed well enough.

It’s important for navigational purposes to understand where these structural reefs and shoals are, but agency at any particular moment is about where we can go, not about where we can’t. Dynamiting Scylla and Charybdis is a worthy project for special occasions but trying to do that daily will wear you out quick, which is one of the worst compounding effects of deprivileging. So in a practical, quotidian sense the question is how authority works under less-than-ideal conditions.

Here I think it’s helpful to come at the question a little bit sideways from the usual focus on qualifying privilege and disqualifying stigma. Things look pretty desperate from that standpoint. We see white guys living it up in the lap of esteemed luxury and ‘others’ struggling, and it looks like the single effective variable is whiteguyness. Looks like we’re stuck with the exhausting dynamite campaign. But wait – what do we do with all the white guys who struggle in the classroom? And what do we do with the race/ethnic/disabled/women/etc. who get in the classroom and kick some ass, without blowing up everything in sight or even breaking a sweat? Don’t we all know some of each of those? Maybe it’s possible to factor out the structural race/gender variable and get comparable positive and negative results across categorical populations! Jeepers, a playground for agency!

The problem with how these discussions go is that they tend to be informed by a lot of reciprocal ignorance and mythology. It’s well-established at this point that hetero white guys don’t know squat about what it’s like to be black/female/queer/etc. We drift around in a happy daze at the gravitic null-point of all social stratifications, unburdened and oblivious to the burdens of others. And relatively speaking, which is all I ever do, this is true. But as Goffman tells us at some length in Stigma: Notes on the Management of Spoiled Identity, it’s also relatively speaking false. The ideal, unspoiled, unstigmatized identity imagined by disgruntled white-guy voyeurs is a mythic construct not embodied by any real person (this is, for example, the founding joke of “American Dad”). Being a white guy helps a lot in some ways, but it looks better from outside than inside; and if you’ve never been one, you’ll have to take my word for that. We’re all vulnerable in big ways and small, Goffman says (Foucault agrees), and each social interaction is the opportunity for anxious and reciprocal attempts to deploy/negate strengths and conceal/discover weaknesses.

Students looking for an edge against a professor just bump on down the checklist until they find something that will work for them. Race, gender, ethnicity, sexuality, disability make things easy, but it’s a poor strategic interactant who stops there. Nor is whiteguyness much help, after the moment of entry, when there are other white guys around to cancel that advantage out. We’re a dime a dozen, and when we think things are at stake we rip on each other something fierce. We know each others’ weaknesses. So when white guys succeed in the classroom, it’s helped us at first to be white guys but then it almost instantly hasn’t, and we’ve had to deploy some other strength. What is that?

As olderwoman perceptively noted, classroom success comes to those who “carry privilege, a presumption of competence and authority with them into the classroom.” This is the ‘other strength’ that intercepts the stigma game. She ascribes this to upper-class white men, but notice that what’s being described here is not categorical identity but what Bourdieu calls disposition: an acquired scheme of perception, thought and action. Now, categorical identity is still significant because the dispositions of competence and authority are native products of the rich white boy habitus, and are interactively recognized as such. White boy privilege is therefore a kind of symbolic capital, enforced through symbolic violence or its threat. It is in this sense that olderwoman is entirely correct that “[p]eople whose status is unquestioned can afford to be Mr. Cool with students,” because the threat of symbolic violence is understood and gratitude for its forbearance is ritually extracted. And this dynamic is what allows symbolic capital to be converted to economic and social capital, in the form of access to careers, advancement, esteem. Thus structure is produced and reproduced in everyday relations.

If we let it. Here’s where I agree with Marx that our conscious human history has not started yet. The dynamic of dispositions and habitus I have described above does not take us very far past the pack behaviors of dogs. In this connection it’s fascinating to watch the Dog Whisperer. Like the Nanny with children and parents, Cesar Millan’s whole insight is that when subordinates are getting unruly it’s not a follower problem, it’s a leader problem. The show gets old quick because it’s always the same schtick – come into a house, find owners fretting about ‘problem’ dog, discover the dog’s just confused about who’s in charge, train owners how to be in charge. Bingo bongo. And the real problem quickly emerges: white, black, man, woman, straight, gay, lotsa lotsa people have no idea how to be in charge of themselves, let alone others, even just dogs!, and anxious yapping ensues.

Cesar teaches the acquirable big dog skills of authority and competence to folks who for one reason or another perceive, think and act like little dogs. That is, he backfills the dispositions that make white guys winners in the big everyone stigma game, and alpha white guys winners in the little white guy stigma game. There’s nothing magical or mysterious about it, either. “Cesar counsels people to calmly, assertively, and consistently give their dogs rules, boundaries, and limitations to establish themselves as solid pack leaders and to help correct and control unwanted behavior.” That’s what the Nanny says about dealing with kids too. That’s what Obama did in this last campaign.

Calm assertion; clear, consistent boundaries. Not grand gestures, not puffery, not loud yapping. Those say ‘not trusting my own authority and competence, overcompensating’. Not negotiation, pleading or resentful disengagement. Those say ‘power vacuum here, please fill it’. Cesar thinks everyone can learn this. I hope so, because I don’t think we get over these pack-power games and get to human together until we do. And until we do, all of those categorical accounts of why things aren’t going right for us, even when they’re true, are little more than theodicies.

December 21, 2008

Kool-Aid cocktails

by Carl Dyke

I’m still chewing on the conversation at the earlier lumpenbourgeoisie post. Profacero remains firm that whatever merits academic employment may have cannot justify the poor pay. She keeps the high expectations and high self-subsidized costs of our work in view, with specific examples like research and conference expenses, adjunct stints at less than a living wage, crushing personal debt. This is all real stuff. We have no dispute about what actually happens. All of it has happened to me and many people I know, although more recently I am somewhat insulated from some professional costs by the relatively low formal scholarship requirements at my nice teaching-oriented regional SLAC — which means gaps in the CV that, along with my status as a tenured associate professor, pretty much take me out of play on the market and bind me to this job.

Profacero would also like to be able to afford a small boat. I wouldn’t have minded being able to afford my divorce, which despite everyone’s good intentions cost nearly twice my annual salary. Other colleagues have aging parents to provide for. Ponies are always nice. These things are relative, but the point is that we’re not paid enough to afford many things we might reasonably need or want. And at many places the belt is tightening, as Dr. Crazy discusses in an incisive post following up on others by herself, Historiann, and Tenured Radical, with whom I completely agree. Of course there’s also much to be learned and pondered about conditions and compensation for academic work from Lumpenprofessoriat, e.g. here, and What in the hell…, e.g. here, and Marc Bousquet at Brainstorm, e.g. here.

I’m all for doing what’s possible to enhance conditions and compensation for work, for everyone. I’ve argued that there may be costs along with the obvious benefits to academics specifically for resorting to unions to do that, just as there are costs and benefits to pulling a gun in a bar fight or putting Pavarotti on the jukebox at a party. The situation inevitably gets structured in a certain way you may or may not like when you make those moves; it would be good to consider alternatives. I’m a real fan of the aikido ethic, but to my knowledge we’ve not even begun to think of how something like that might apply. I’ve also argued that dire though the plight of tenured/tenurable faculty might be, for whingeability it doesn’t sort real high on the priorities compared to other folk with genuinely crappy lives, ranging from permanent adjuncts to some of our support staff to starving Haitian babies.

OK, so what’s this post about? It’s about ‘drinking the Kool-Aid’.

I’ve been arguing that whatever influence we may or may not have over the material realities of our employment, we completely control our attitudes toward them. We get to choose how we think (and, to a lesser degree, feel) about these facts we all agree on. We become what we pay attention to, as Mead and the interactionists say. Or Nietzsche: “And if you gaze for long into an abyss, the abyss gazes also into you.” So while we ponder available remediation or transformation strategies, we also get to direct our attention, think and be ourselves in the now. And I’ve remarked that in the context of this particular now, given the available alternatives, I’m pretty pleased to be drawing a comparatively decent salary to be doing work I notice is personally and relationally affirming. Profacero thinks that I’ve drunk the Kool-Aid, this is a delusional form of pathetic sacrifice, and “they’ve got” me where they want me.

Perhaps. I agree that sacrifice is pathetic, but what I do doesn’t seem like sacrifice to me. “We are all conformists of some conformity or another,” Gramsci said. It’s not whether you’ve drunk Kool-Aid, it’s which Kool-Aid you’ve drunk. You can drink the angry, alienated Kool-Aid or the woeful, victimized Kool-Aid or the contented, peaceful Kool-Aid. These are all interpretive stances. None of them is more or less ‘true to life’, and none is inconsistent with working to make things better, but the latter will take some of the sting out of your day. What we do has value; or at least, it’s what we do. This is Existentialism 101, “The Myth of Sisyphus.” Our fate belongs to us. The struggle itself toward the heights is enough to fill our hearts. We can be happy.