Archive for ‘feverish misunderstanding propagation’

May 9, 2021

Making pie

by Carl Dyke

My grandma was famous for her pies. (I don’t actually know what that means. I never saw her on tv or anything, but there was a family lore to this effect.) I got to eat Grammy’s pies from time to time, and they really were yummy. She was from New England, which helps except when it doesn’t. And she had the knack of crafting a richly fatty crust that was still light and flaky, and she selected and prepared the fruit so it popped and snapped and crisped and smoothed to the tooth as it should, and she deftly balanced sweet and tart for a fulfilling burst of contrasting taste and texture sensation in every bite.

Mom was (is) also great at pie, so I grew up with pie as a kind of family heritage. Sometime in my teens, I don’t remember the details, I decided this meant I ought to figure out how to make pie. And so I did a little asking, and a little watching, and a little noticing, and a lot of reading in the canonical texts (especially Joy of Cooking, I believe), and I baked a pie. And then a few more, but they’re not the story. Like a lot of things I learn I sort of did pie and then was done with doing pie.

As I recall and therefore assert, that first pie was not at all bad. “Edible,” as we say in my family. Because making a pie that’s not half bad is well within the reach of someone who’s been around good pies and knows how to ask, and watch, and notice, and read.

Of course my first pie was not up to the standards of the family masters. I don’t suppose many pies in the world were, or are, up to those standards. I’ve had a lot of pie over the years, including some very good pie and some pie made by people whose business it is to make pie, and I’ve never had anything better than Mom’s or Grammy’s pie. How could my first pie have been that good? It couldn’t. But it was fine, a perfectly decent and edible pie.

Sometimes I’ll get into discussions at school, or in the world, about what it means to learn and to know things. You’d think we could just agree that knowing things is cool, and that asking is helpful, and watching is helpful, and noticing is helpful, and reading is helpful, and experience is helpful, and that there’s usually more to know, and that knowing more is generally better than knowing less. But there are plenty of folks who will adamantly deny that there’s any knowledge advantage to focused study and years of experience, or that it’s possible to know anything from “book learning,” or that leaning in and paying attention is important, or even that knowledge of any kind is of any value at all. They’ll actively resist knowledge that doesn’t come in the right look and feel and delivery system. They’ll get into goofy ranking games about who is a good person based on what they know and how they know it.

I … don’t care? I think it’s pretty cool you can make a decent pie if you pay attention and do some reading. And I think it’s pretty cool you can make even better pies if you keep making pies and just get better at making pies.

March 22, 2021

The nightmare of dead generations

by Carl Dyke

When I was a young intellectual in college I was really excited about the power of ideas to change the world. Just as the ruling class thinks rules are very important and the working class thinks work is very important and mothers think nurturing is very important and warriors think violence is very important and pigs think mud is very important, eggheads think thinking is very important. Sometimes you just have to smile and nod politely when people say their little thing about how the world works. It takes all kinds.

So a popular idea about ideas in coffee shops, brew pubs, and philosophy departments is that ideas make things happen. This idea should die the first time you try to make it come true, but it’s easy enough to avoid this confound if you only ever have ideas and never try to make them come true. You can also play all sorts of association games, where things were happening and meanwhile people were having ideas that sort of resembled the things that were happening, and so there we have it, ideas making things happen. Never mind the same sorts of ideas were around in other times and places without kicking up much dust. This is really easy to see with, say, Nazism. Nazism itself is a portmanteau of all kinds of garbage that was around in European thought and practice for at least hundreds of years. Finding Nazism precursors is like shooting ducks in a bucket. It didn’t just go away when Hitler blew his brains out in the bunker, either. Naziform thinking is as routine as lawns and registered animal breeds and thinking it means something to be from Pittsburgh. Japanese Buddhists of the early 20th century were basically Nazis, as it turns out, because they were zeeeennn with obeying orders and committing genocide. Yes, bonsai is a little creepy. You know what’s completely normal as part of the distribution of human thoughts and feelings? Control freaks and motivated reasoning and being just fine with other people’s suffering.

The historical challenge is to get, say, Christianity, or communism, or free markets to be ideas that map cleanly onto the accomplishment of enumerated goals. If two thousand years later we’re still having trouble with loving thy neighbor, it might be that the relationship between the Christian idea complex and the world of living and doing stuff is importantly indirect. If you actually want to change things, it’s incumbent on you to see what else might have been going on that murderously intercepted living together in freedom in the great 20th century communist experiments. And every time a free market gets ‘captured’ by dynamics other than the free and fair pursuit of individual self-interest, shaking a finger and scolding people for doing it wrong are on the silly side of sociopathic wokeness. Don’t get me started on the Constitution.

It is just impossible to find any ideas in, say, 18th century Europe that didn’t have slavery and white supremacy as part of their effective context and content. Sometimes this is explicit, as in the case of pro- and anti-slavery tracts. More often it’s just that nothing about Europeans’ world in the 18th century was possible without slavery and white supremacy, although a lot of it had been true on other grounds some hundreds of years before. So you couldn’t say “I like a good ham sandwich” without saying “I like a good ham sandwich where cuisines and standards of quality in foodstuffs are marked out on a hierarchy with race at the bottom, and in a political economy that conditions their availability to the likes of me,” where that political economy included the labor and status and price and finance and power gradients of slavery and white supremacy. I’ve started to talk about how ideas are emergent in systems. And, you couldn’t say “I want freedom for X,” without saying that at least for the moment you were ok with conditioning that freedom on the continued enslavement and domination of global brown peoples. And you couldn’t say “that’s not what I’m talking about,” the way philosophers do as the special way philosophers get to manage their own ignorance as if it’s deep wisdom, by sealing off their special kind of thinking from their context, without baking the context that allowed you to think like that right into your ideas as their essential contingent precondition.

But, I haven’t said anything unique about Europe or the 18th century or white supremacy, have I. This is just how ideas work. The bits and pieces of them are floating around all the time, everywhere (who doesn’t like a good ham sandwich or rank people or want the world to be just so), but the specific assemblages of them have a time and place. If they make anything happen, it’s because they’re in the flow with lots of other stuff in context. And so it goes.

Not that eggheads ever need any particular context to think like eggheads do! So another dopey thing the people you would expect to think like this think in coffee shops and brew pubs and philosophy departments is that ideas, being powerful, don’t have contexts but are just ‘in play’ all the time. So to pick an example out of a gagillion, we could continue to be interested in Hegel as a live option for how to think about things. And find to our disappointment that he’s not only a racist like everyone else in Europe in the 18th century, but also a little fuzzy and maybe not great on the question of complex systems and emergence. Like, it’s in some sense tautologically true that the world develops according to its logic, which is by definition rational. So there we have it, the progress of reason. But this may not be helpful if we’re interested in how systems shift more comprehensively, which would have to involve contradictions within the logic of history. So we’ve reinvented the Young Hegelians and we’re well into the 19th century now. And then Marx blows it all up with the 11th thesis. You can describe the world any way you like, according to whatever logic suits your fancy, but none of that means anything unless you can get in there and change things. Which, we mostly still don’t know how to do, not least because we keep casting our wistful glances at kludgy old ideas with racism or whatnot baked into them. Lather, rinse, repeat.

Being a practicalish kind of guy with limited time for other people’s descriptions of the world in various ways, I think it’s kind of fun Hegel might have been on the trail of complex systems, but it wouldn’t occur to me to try to get Hegel to be relevant to that discussion now. I’d start with Poincaré, who was just straightforwardly trying to figure out complex systems early in the game called complex systems. But I really wouldn’t start with Poincaré either, first because he’s pretty mathy, and second because there’s an intervening hundred years of smart people working that angle. I don’t look to Darwin for the state of the art on evolution either, although you can find plenty of dingbats on social media doing just that and heroically refuting 19th century science. So I might start with Prigogine, or the early systems theorists like Wiener, as my dear papah Dyke the Elder suggested I do back when I first got sick of chasing the magic of ideas and started poking around at what else was going on. But really I’d try to catch up on what people who study complex systems have been learning about complex systems over the last twenty, ten, five years. Which, is what I do. I don’t expect Hegel to come up anywhere in there, although there’s always some romantic who remembers freshman philosophy and thinks he can get hot dates by connecting the dots.

August 7, 2020

Why I won’t be using Zoom

by Carl Dyke

There’s a lot of personal detail in this post. I think it’s necessary, and also in my case pretty funny because I’m empowered to shield myself from the unfunny bits. But if you can’t be bothered I don’t blame you. The tl;dr is that for me and some other people, I reckon, the experience and performance of self is awkward in ways that make personal imaging technologies existentially confusing, disruptive, or even threatening. I don’t think I’m saying anything new here. But as we head into a technology-mediated school term because of the pandemic, I have this to add to the lore of video course delivery and the confounding diversity of human kinds.

My Grandma Liz famously disliked being photographed. This seemed odd to everyone else, because by many standards she was a beautiful young woman and a handsome older lady, with strong features and an intelligent gaze. The standard garbage folk diagnosis was vanity, but her frank discomfort with her own image ruled that out. Some of us chalked it up to the free-floating poisonous critical judgment that can emerge from the family talent for observation and fine discrimination. And certainly it becomes swiftly wearing for a smart, ambitious woman to be constantly reminded that for others she’s little more than a pretty face and a fine rack of lady parts. In any case this was nowhere near the only way Grandma was odd, as are we all, so we all got on with it. I don’t have Grandma’s figure (it’s probably for the best), but I used to get along pretty well with her and I’ve gradually come to believe we had something more permanent in common.

At some point when I was a kid, I remember being given to understand that Dad was concerned I might be showing signs of self-absorption. This was a pretty serious party foul in Dad-world so I installed it as a priority hypothesis to test in a life cobbled together out of experiments. I think the irony must have been lost on me at the time. There was plenty of evidence – I was pretty fascinated with mirrors, or really reflective surfaces of any kind. I looked at myself any chance I got, from every angle I could. Store and car windows were magnetic, personal video selfies before personal video selfies. It probably wasn’t quite obsessive.

Fortunately I was not self-absorbed, at least in the sense of vanity. The issue was not connection but disconnection. I was fascinated with the image because it was obviously ‘me’, but I couldn’t figure out how to get that to make sense. Every time I looked, every step and angle, this uncanny something or other I couldn’t find any way to identify with moved right along with me. In middle school I took the sewing version of home ec and for a few years after that I would buy thrift store shirts and custom tailor them for myself. Badly, which I knew at the time, but it wasn’t really the craft I was concerned with. I have no idea what I thought I was doing at the time, but I was trying, I now think, to get what I looked like to have anything at all to do with how I experienced myself.

Yes, I had a fedora phase. And this:

Felix the Superbeetle and cousin Lindsay

One of my girlfriends in college remarked that when she saw me walking across campus, “it” looked good to her. I was delighted! Yes, nailed it!

Was “it” like that for Grandma too? I have no object permanence to myself. I don’t fear death, because how would it be different? To this day, when I see my reflection in a mirror, in a photograph, or on video, my first reaction is “what the hell is that.” Every. Time. From one moment to the next, I have no damn idea what I look like. Obviously I get queer, and for what it’s worth I count normal as a genre of queer. I get the horror of being pinned into any of the categorical identities, and the further horror of having to inhabit them in self defense. I can really understand why some people automate their self-presentation with stereotyped hair and wardrobe constructs, and I’m sympathetic with the chaos that must break back into their lives when that presentation is disrupted. But when they expect it of me as well I draw the line. It’s not that I want to fight that battle, but I don’t want to live it either. I is the kaleidoscope you see (I guess?), for better or worse.

All of this is stuff I’ve long since learned to manage, or at least live with. The Carl-bot is a practiced performance in many settings, and lets me peek out around the edges of ritual and expectation to express my care in the ways I care to express my care. But the bottom line is that having or making an appearance is an active and chaotic and distracting process for me. It’s work, and adds to the multi-tasking burden of all the other chaotic feeds I’m getting from environments full of other critters like and unlike me commanding my attention in various ways.

Seeing that work reflected back at me in realtime is mesmerizing and awful. Thinking about it happening on all of the other screens is an infinite regress of confounding self-reflection. I know I can turn off my video. I’m not telling you a problem and I’m not interested in your solutions. I’m an adult, responsible, smart, and adaptable. I guess? What I’m saying is, this is why I won’t be using Zoom.

April 15, 2020

Self, echolocation, conspiracy theories

by Carl Dyke

Awhile back I tossed off a remark on a Facebook post that conspiracy theories are a form of echolocation. The host (Neuroanthropology, one of the very best pages I follow) asked me to say more and I gave it some thought, gave it some more thought, realized it had all gotten pretty unwieldy, and wrote this instead. As with most of my ‘pings’ I’m not sure it’s anything much better than a conspiracy theory, but maybe it makes a good blog post:

I’ve been trying to figure out how to answer this without getting too far down the rabbit hole and ending up reinventing the whole history of contemplation. So by way of sketchy sketch, what we call ‘self’ is pretty clearly an emergent, adaptive epiphenomenon of environmental, biological, and cultural feedback systems churning along at various scales. Because it’s dynamic, relational, and adaptive, there’s inherently no stable essence to such a structure. It only persists by active (massively active) engagement with its surroundings, whatever they may be from time to time. This is an energetic process obviously subject to resource constraint.

Adaptation and evolution create a distribution of strategies within this basic dynamic. Interaction is split off into subsystems that operate at different rates and intensities, both within and among ‘individuals’. Resources are differentially committed and optimized around particular interactive settings. For example, it seems that people have various relatively hard wired rates at which learning occurs, with characteristic advantages and disadvantages to slow or swift response to new information.

Again, the dynamic interactivity of self means that its maintenance requires constant orienting feedback with and from the environments, internal and external. This is the echolocation part. But resource constraint means that we can’t be operating active echolocation in every subsystem and every scale simultaneously, and adaptive differentiation means we’re optimizing and prioritizing those feedback loops across a range of strategies. Practically, this means people are going to be active and maybe even ‘needy’ around a range of interactive domains, giving off and taking in information asymmetrically across multiple axes, none of this chosen or conscious obviously.

“Who am I” is a much harder question to answer and keep answered in interactively chaotic environments than in homogeneously stable ones. Environments produce a range of echoes, and processing biases reward different collection routines. It may be that for some people sometimes, somewheres, the mismatch between their pings and the available echoes is profoundly alienating, if not literally crazymaking. You would expect these distributional experiments out on the long tails, and you would expect those tails to get fatter as environments become more variable and chaotic. You would expect people to become more aggressive in their attempts to create and manage congenial echo chambers.

Conspiracy theories then work as a special case of a very ordinary kind of echolocating ping, by broadcasting a strongly biased signal into a chaotic environment likely to generate a loud and clear response one way or another. Although this feedback loop is likely to be identity and community defining, it’s not in the first instance about ‘believing’ the conspiracy theory at all.

April 5, 2020

Dogma and criticality

by Carl Dyke

“I always believed that two masterpieces (I say this very seriously) summarized the thousand-year-old experience of mankind in the field of mass organization: the corporal’s manual and the Catholic catechism. I’ve become convinced that it is necessary to add, though in a field that is much more restricted and of an exceptional nature, prison regulations, which contain true treasures of psychological introspection.” Antonio Gramsci, Letters from Prison I, 97.

Critical thinking is hard. I work in a humanities / social sciences / liberal arts field where we celebrate but seldom practice critical thinking (and perhaps ‘wisdom’). Much more often what you get is some “critical theory” or other reduced to a kitsch algorithm and “applied” as dogma. This is obviously a mistake, but it’s also not at all a mistake. So I’m writing this post out of frustration, but also, I hope, critically.

“Criticizing,” in the sense of finding fault and locating errors, is always available, but it’s the thinnest possible understanding of critical thinking. You’re finding mismatches between one set of standards and another set of practices, which in a lumpy, complex world is like shooting ducks in a bucket. (I started out by doing that here. I’m still doing it. I’m trying to do better. It’s hard.) People start to get good at this kind of critical thinking around the age of two. “Critical theories” at this level simply provide the more or less elaborate standards in comparison to which practices can always be found wanting. Don’t do that thing, do the other thing. “This ham sandwich is not the platonic essence of the ham sandwich.”

I should say that I often revere the standards provided by critical theories. In my experience the platonic essence of the ham sandwich was produced by the Good Food Bus, parked beside Paley Library at Temple University, circa 1983. It was glorious. I’m also in favor of flourishing, human and otherwise, and firmly believe that mean people suck.

What “critical theory” even at its most algorithmic gets right is that criticality is edgy (fractal, as Nietzsche said, and therein lies the abyss). The center of any practice is never where the critical action is, as any bored suburban teenager can tell you. In those centers there’s just a way things work, and part of how they work is by locking out critical disruption. At a systems level all of the metaphors of mechanism and organism suggest themselves here. Engines and hearts don’t do a lot of critical thinking about how and whether to make the old crate go, and that’s for the best. It’s never a good thing when those subsystems ‘go critical’.

This sense in which locked-in, algorithmic regimes of ‘normal’ enable the smooth functioning of business as usual is incredibly, one might say critically, important. This is what critical theories do for communities of like-minded intellectual practitioners, and why they’re critically not critical. As Marx himself noted, there’s nothing less critical than “Marxism.” The road from theory to cult to cultural system is paved with good intentions and sound practical reason.

All of this makes both practicing and teaching critical thinking really tricky. Anything we tell the students is dogma. If we drill them in it they are foot soldiers, or prisoners. If we tell it to them charismatically and they are moved to embrace it, they are disciples. If we show them our work, they become priests. To enter criticality and think critically, they must somehow evade the syllabus and the curriculum with which we tell them things, without just falling back into a prior dogma. The ones who get this routinely drop out, leaving the priests to reproduce their orthodoxies on the next generation.

In complex systems, such as all of our doings together, criticality is the turbulent edge between order and chaos. It’s a creative but dangerous space. Critical thinking is hard precisely because you have to suspend and disrupt the algorithms, entering criticality and exploring the possibilities that become available there, with the settled order of dogma behind you and the wild chaos of nonsense churning all around.

May 15, 2019

Imperial disciplinarity

by Carl Dyke

One of the interesting things you learn if you hang out with disciplined people is that although they understand there are other disciplines which do some stuff or other, they generally think their discipline is the master discipline. So to take a small subset of examples, people in the Literature discipline tend to think of everything as literature, and people in the Philosophy discipline tend to think of everything as philosophy, and disciplined historians will point out that everything is or is becoming history. And physicists think everything is physics, and engineers look at the world as a series of engineering problems, and lawyers always gotta be lawyerin’, and so on and so on. Of course they’re all right.

The funny thing about people who are disciplined in this way is that they think their discipline, as the master discipline, is already interdisciplinary. History is something literature folks obviously pay close attention to as they examine the writings of, set in, and about the past, so really they’re also historians, and historians are just slightly confused adjunct literature scholars who, if we’re being collegial, are just focused on reading and interpreting somewhat less interesting sorts of texts.

We can assume no actual person actually thinks any of this nonsense and that I’m just ranting and waving my hands in my usual undisciplined way. In any case we’ll call this straw man imperial disciplinarity. And I think imperial disciplinarity goes a long way toward straw mansplaining the routine inconvenient fact that no matter how much people in the academy say they’re excited about interdisciplinarity, which is a lot, interdisciplinary efforts predictably fizzle out, with no durable exceptions I’m aware of (and I’ve been paying attention to this question for the better part of forty years now). Because you yourself are already splendidly interdisciplinary, and your discipline is the master discipline that embodies Education, Culture, Rigor, The Liberal Arts, Science, Knowledge, Wholeness, Purity, The Good, and The Fresh Scent of Newborns. And why would you want to work closely with people who when it comes down to it are just narrow and deluded subset knockoffs of all that or, like, wrong?

But there’s another inconvenient fact, which is that the presence of physics in literature (and vice versa) does not actually confer any particular knowledge about physics. Or history, or philosophy, and vice versa. So while we may imagine that the storm in “King Lear” fictionally performed according to correct meteorological dynamics, reading or better seeing or better yet acting in “King Lear” gives one exactly zero substantive expertise in meteorology. Of course! This is not a flaw. But the literature of physics is in math, isobars and gradients and whatnot in this case, and math is the discipline those of us who like to read books took lit classes to avoid. Point being, as soon as the substantive expertise of a field is in play, the claims of each of the imperial disciplines are revealed to be just plain silly, and embarrassing. And then the abyss of just slightly less than infinite ignorance yawns and says, time to wake up?

Which is why, again, no actual person actually believes any of this. Because if anyone did believe this they would have sealed themselves off from thinking too hard about what it means to be educated, which of course is not just to be disciplined but to have at least a conversance with disciplines other than one’s own. Enough at least to understand in broad outline what those folks are up to, that it’s wicked important stuff, and how it is very definitely not just a narrow and deluded subset knockoff of whatever your thing is. Enough to take interdisciplinarity seriously as a primary educational imperative and the lifelong commitment of educated people rather than brushing it off as something you already do and a kind of optional ornament to a proper master disciplinary training.

August 27, 2018

Memory work

by Carl Dyke

Recently I bought a load of driveway gravel from a local landscaping yard. The guy was an efficiently skilled tractor operator so it was the work of five minutes to get two buckets of mixed gravel and base into the bed of the pickup. For the next hour I leaned on his loader frame and he talked his thoughts and world at me.

Not surprisingly a lot of it was paranoia and racism. I learned loads about how hard it is to make a living in landscaping when your competitors are undocumented immigrants who can bid jobs without factoring in the costs of bonding, insurance, and taxes. I learned that some tractor sales and service companies will deliberately sabotage your machine to make a buck on the repairs, and that when your equipment goes down in the middle of a job you have to pay top dollar to have it seen to right away. I learned that if you buy your cars from the same dealer over a period of years they start to take you for granted, and that they’ll deliberately delay a repair until you’re out of the warranty window. I learned that you can admire and remain friends with people who do you this way. I learned that 98% of us white men voted for Trump because he says the things we aren’t allowed to say. I learned that bush-hogging is a terrible job because who knows what equipment-destroying solid objects are hidden in that underbrush you’re clearing, and that no one wants to pay you for this risk. And I learned that it’s much more efficient to shift piles of material by pushing and pulling with a blade than by picking it up and putting it down with a bucket.

Somewhere around the hour mark it seems to have occurred to this fella that I might have somewhere else to be, and I allowed as how I might want to get to work soon. He asked where that was, and I told him teaching History at the local university. So then we had to have the conversation where he told me everything he thought about education and unpacked his own history as a student.

His most vivid memory was high school English class in eastern North Carolina, tobacco country, in which he did a lot of what he called “memory work.” The chalkboards on three sides of the room would be filled when the students came in, and they were to memorize all of it. He mentioned in particular the Beatitudes, selected Corinthians, and Shakespeare. There was drama over his recitation of the Beatitudes, which he attempted three times without the teacher marking credit in the grade book. This was an exercise of arbitrary authority, but also completely normal and just to be expected.

I remember doing a lot of memory work in Italian school when we lived there in the early 70s. I don’t remember what, exactly. I have a phone in my pocket now with 32 gigabytes of memory, enough to store every bit of text ever produced by the human race until we started texting “‘Sup?” “Not much” at each other by the terabyte.

Here are the Beatitudes, from Wikipedia:

Blessed are the poor in spirit: for theirs is the kingdom of Heaven. (Matthew 5:3)
Blessed are those who mourn: for they will be comforted. (5:4)
Blessed are the meek: for they will inherit the earth. (5:5)
Blessed are those who hunger and thirst for righteousness: for they will be filled. (5:6)
Blessed are the merciful: for they will be shown mercy. (5:7)
Blessed are the pure in heart: for they will see God. (5:8)
Blessed are the peacemakers: for they will be called children of God. (5:9)
Blessed are those who are persecuted for righteousness sake: for theirs is the kingdom of heaven. (5:10)
Blessed are you when others revile you and persecute you and utter all kinds of evil against you falsely on my account. Rejoice and be glad, for your reward in heaven is great, for so they persecuted the prophets who were before you. (5:11-12)

June 25, 2018

History of the essence

by Carl Dyke

This is a thing for the History tribe right now. Maybe worth talking about, maybe not. From the open letter to the College Board (AP World History) by the Medieval Academy of America. I’m not linking because I don’t actually want to fight at them, I just want to roll around in a little disgust among friends.

“By beginning ‘world history’ in 1450, the College Board is essentially sending the message that premodern culture and events are unimportant. It is impossible to make sense out of the political and historical climate of the mid-fifteenth century without a grounding in what came before. It is especially unfortunate to suggest, with the 1450 start date, that “world history” effectively begins with the arrival of white Europeans in North America, coupled with the mass extinction (chiefly through disease) of substantial segments of native populations. A pre-1450 start date would facilitate study of a global Middle Ages, a period when regions such as China, Mali, Ethiopia, Armenia, and Egypt had great achievements, in conditions of relative parity, before the oceanic dominance of a few western powers (Portugal, Spain, Holland, England, France). We have all seen how misappropriation of medieval history leads to the advancement of dangerous, racist narratives. Only education can counter such misuse of history. Teaching the reality rather than the fictionalized fantasy of the Middle Ages has never been more important than it is today.”

Good lord this is vacuous.

“By beginning ‘world history’ in 1450, the College Board is essentially sending the message that premodern culture and events are unimportant.”

It is essentially sending the message that premodern stuffs are nonessential. Since there’s no absolute grounding other than complete and comprehensive inclusion for declaring particular histories essential, this is unremarkably true. What’s needed then is a claim about premodern stuffs being important in this context, not aggrieved partisan handwaving. Here it comes:

“It is impossible to make sense out of the political and historical climate of the mid-fifteenth century without a grounding in what came before.”

Sure! But it’s also impossible to make sense of what came before without a grounding in what came before that, so this is an inane infinite regress. We must start somewhere.

“It is especially unfortunate to suggest, with the 1450 start date, that “world history” effectively begins with the arrival of white Europeans in North America, coupled with the mass extinction (chiefly through disease) of substantial segments of native populations.”

World history may start billions of years ago, depending how you count and what questions interest you. Modern world history, where all the questions modern people have are inescapably located, does plausibly begin around 1450.

“A pre-1450 start date would facilitate study of a global Middle Ages, a period when regions such as China, Mali, Ethiopia, Armenia, and Egypt had great achievements, in conditions of relative parity, before the oceanic dominance of a few western powers (Portugal, Spain, Holland, England, France).”

Neat! Seriously, good stuff! Rock on with that, professional past knowers! Write books, articles, and blogs for all who become curious what happened long ago to discover and revel in.

“We have all seen how misappropriation of medieval history leads to the advancement of dangerous, racist narratives.”

We have? While we were at it, did we see anything about shady linear monocausal argument by assertion? If I said, We have all seen how dangerous, racist narratives lead to misappropriation of medieval history, how might you go about disentangling this elementary causal loop? Is this the quality of analysis we can expect from careful study of the Global Middle Ages?

“Only education can counter such misuse of history.”

This is a religious statement, likely false, and possibly completely false. But let’s keep giving education a try in case it starts working this time.

“Teaching the reality rather than the fictionalized fantasy of the Middle Ages has never been more important than it is today.”

And here, at last, we can agree.

H/t Colin Drumm.

February 8, 2018

People, bodies, characters

by Carl Dyke

Dyke the Elder recommended a fun book recently, The Infidel and the Professor by Dennis C. Rasmussen. It’s about the friendship between David Hume and Adam Smith and I’m looking forward to reading it. As we were talking about it I also thought of the book I’m using as the core text in all of my classes this semester, the Narrative, of a five years’ expedition, against the revolted Negroes of Surinam, in Guiana, on the wild coast of South America, from the year 1772, to 1777 : elucidating the history of that country, and the description of its productions, viz. quadrupedes, birds, fishes, reptiles, trees, shrubs, fruits, & roots; with an account of the indians of Guiana, & Negroes of Guinea. By Captn. J.G. Stedman. Illustrated with 80 elegant engravings, designed from nature, by the author, 1796. I am not an expert on this text! I’m teaching it so I can learn new things.

As you know, Bob, all sorts of interesting stuff was happening in the late 18th century Atlantic World around the universal themes of freedom, rights, and humanity. And for just as long people excluded from the universe of propertied white men have been pushing back on their degradation to the service of their oppressors. The mismatch between the high pronouncements and glittering achievements of the Enlightenment and the grim practices of the colonial slave economy that financed it is clear enough to us now that it may even seem it was clear to everyone then, too. A book like Stedman’s is interesting because it’s right in the middle of the ideas and practices we’re interested in, but isn’t the product of hyper-elaborated cutting edge high intellection. What did a guy who was pretty much just a guy think about, for example, the personhood of enslaved Africans?

Stedman was an interesting nobody, a low level Atlantic World cosmopolitan born and raised in the Netherlands but identifying as an Englishman, a brevet Captain in a Scots regiment who couldn’t afford to buy himself a higher rank, who shipped out to Dutch Guiana because that’s where the action was for an ambitious guy of talent but little social or economic capital like him.

He was a smart cookie but he was little educated and no philosopher, so it’s interesting to see how he thought about the leading intellectual issues of the day, or rather, how he didn’t. His book was published as an abolitionist tract, complete with gorgeously gruesome engravings of slave torture by William Blake. But Stedman himself was untroubled by slavery, which he mostly didn’t think about, but when he did thought was convenient and patriotic and probably good for the slaves all in all. What bothered him was excessive cruelty in the treatment of slaves, which he reports with outrage and ascribes not just to the Dutch, but to everyone else as well, especially the Jews.

The front matter of the text is full of conventional hyperventilation about his poor literary gifts and the advantages of authenticity and veracity this guarantees. But in the same breath he tells us that

Here, in the different characters of a Commander — a Rebel Negro — a Planter, and a Slave — not only tyranny are exposed — but benevolence and humanity are unveiled to the naked eye. Here the Warrior — the Historian — the Merchant — and the Lover of Natural Philosophy will meet with some gratification; while, for having introduced my private adventures, I must make some apology — but none for those of the lovely Slave, who makes not the least interesting figure in these pages — as female virtue in distress, especially when accompanied with youth and beauty, must ever claim protection.

So are there any people in this text? Well sort of — there are characters, literary abstractions, some of them people-based, some of them (tyranny, humanity, female virtue) more directly concept-based. He certainly elevates the lovely Slave by treating her as a princess and a lady, but does he humanize her? It’s a better character than barbarous Jew, that’s for sure, and also than domestic labor appliance, and perhaps than mere “black body,” as we now say to dramatize the degradation of African humanity within systems of oppression. But characters dehumanize everyone. Her character certainly tells us nothing about her as a particular person nor, as feminist scholars have thoroughly established around the princess and lady tropes in our day, is it likely to lead there. She, whoever she was, is nowhere to be seen, replaced by a damsel in distress. And when it turns out from Stedman’s diaries that he purchased her from her mom as domestic help, and abandoned her readily for a proper white wife (who he didn’t get along with and used for socially appropriate reproduction) on his return to Europe, and that he was just routinely having sex with all of the slave women when the mood struck him, there’s just not much human left under the romanticism to have anything like human rights, let alone all of the detail people have.

Compare all this to one of the first stories Stedman tells in the main text, in chapter 1. He’s on the ship taking him to Surinam, in the middle of the Atlantic. He’s just told us about some interesting sea birds and gunnery practice.

On the 14th, in the morning-watch, we passed the Tropic, when the usual ceremony of dunking the fresh-water sailors was ransomed by tipping the foremast men with some silver. About this time the Boreas most unluckily lost one of her best seamen, the boatswain’s mate, whose hand slipping by the wet, he pitched from the fore-yard-arm into the sea. His presence of mind in calling to the captain, as he floated alongside, “Be not alarmed for me, sir,” in the confidence of meeting with relief, attracted peculiar compassion, and even caused some murmuring, as no assistance was offered him; in consequence of which, after swimming a considerable time within view, the unfortunate young man went to the bottom.

The next paragraph covers trade winds and dolphins, which he thinks are superficially charming mooches.

Are there any people in this story? There’s the mate, the captain, and the compassionate murmurers. It might be more accurate to call all of these ‘roles’. Presumably the Captain, in his authority, made a cost benefit kind of decision between hauling the whole ship around and losing way vs. losing a boatswain’s mate, and found the latter loss more tolerable. In a split second of responsible decision this fine fellow finds his value, and it’s remarkably low. Glug, glug. Well, right about the same time you’d line up rows of guys like this a few paces apart and have them blast away at each other with muskets until one side or the other broke. Talk about bodies. That was Stedman’s world.

I think it’s fair to say that Stedman had nothing at all resembling an abstract theory of universal humanity, and so the discourse of dehumanization would have made little sense to him. People came in various characters, roles, ranks, types, uses, and situations. He seems to have been able to deal with them accordingly without drawing any conclusions about their further attributes or qualities, sort of like the Walmart checker and I do with each other. Killing rebellious Negroes or any other sort of enemy was fine with him; making them suffer unnecessarily in the process was not. There was a person in those bodies, but for the most part he wasn’t concerned with who that was. In fact across the board, he seems to have thought that wasn’t any of his concern.

March 30, 2017

Fortuna’d son

by Carl Dyke

I just almost got myself into an internet fight with a deontologist.

Fortunately I kept my wits about me and took a powder. Nothing good ever comes of getting into it with the righteous and literal-minded. The occasion was a Facebook post on Erica Benner’s Guardian essay asking “Have we got Machiavelli all wrong?” Well of course ‘we’ have. She tells the familiar story of teaching Machiavelli the usual way, as a shill for power; but then starting to pay attention to all the stuff he says (especially in the Discourses on Livy) about freedom and citizenship and republican virtue; and finally realizing that all of the Prince stuff is framed by the other stuff as cautionary tales and instructions to a free people on how to spot and resist tyranny. This version of the argument obviously has Trump in mind. Of course careful readers have been having something like this epiphany for hundreds of years, not least Gramsci, as I have discussed at length.

The deontologist shrugged off the context and insisted on the text, where Machiavelli plainly says things about the exercise of power that are morally repugnant. QED. Machiavelli is the Disneyland of is/ought theorists. Never is it more plain that deontology (and its evil twin consequentialism) emerges from fundamental intellectual laziness. Morals do all the work of keeping things neat and linear, selecting out a priori all of the confounds. Nowhere to be found is any sense that the world is a manifold we stumble through with all manner of dispositions, habits, practices, heuristics, improvisations, reflexes, desperate gambles, selective ignorances, constraints, affordances, conditions, situations, assemblages, trajectories, strategies, roles, identities, networks, and whatnot before we ever get anywhere near ‘ethics’, the tidy parlor game of the mind. Take your shoes off before you walk on the carpet.

I am aware that there are myriad permutations and subtleties I am trampling upon here. It is my intention, nay, duty in life to never get drawn into any detailed examination of these. They have nothing to do with any serious business. With Machiavelli, we start with a person trying to make sense of and be effective within a lifeworld, a particular situation in turn of the 16th century Italy that constitutes and embeds him in particular conditions, dynamics, opportunities, threats, resources, and so on. He remains interesting because he takes a real crack at that, which means he has zero fucks to give about systematic ethics.

Nowadays we talk about real takings a crack in terms of complex dynamical systems. Machiavelli signals that’s what he’s up to, according to the available idiom, through the concept of fortuna. In my dissertation I talked generally about fortuna as contingency, following Pocock. But I would now translate fortuna and its conceptual partner ‘corruption’ into the range of complex dynamics covered by chaos, emergence, nonlinearity, and self-organization, arising respectively from broad historical processes and human relations more specifically. As an analyst, Machiavelli saw chaotic historical and interactive fields that defied linear causal analysis. As a strategist, he was looking for the stocks and flows that could be nudged toward emergence into a (meta)stable political order.

Here’s a characteristic orienting gesture, from The Prince chapter XXV, “What Fortune Can Effect in Human Affairs and How to Withstand Her:”

It is not unknown to me how many men have had, and still have, the opinion that the affairs of the world are in such wise governed by fortune and by God that men with their wisdom cannot direct them and that no one can even help them; and because of this they would have us believe that it is not necessary to labour much in affairs, but to let chance govern them. This opinion has been more credited in our times because of the great changes in affairs which have been seen, and may still be seen, every day, beyond all human conjecture. Sometimes pondering over this, I am in some degree inclined to their opinion. Nevertheless, not to extinguish our free will, I hold it to be true that Fortune is the arbiter of one-half of our actions, but that she still leaves us to direct the other half, or perhaps a little less.

So much is happening that is not and cannot be under any kind of direct human control. It would be reasonable to give up all hope for intentional action. But he sees free will as one constrained operator within a dynamic field, and on that limited basis it’s worth working out how to be more rather than less effective.

His solution relies first on the pre-stocking of all of the resources, conditions, and happy accidents needed to assemble the new order: a free and virtuous citizenry, custom, law, yes ethics, religion, institutions, checks and balances among the competing power bases. The configuration and interaction of these make up “the spirit of the times:”

I believe also that he will be successful who directs his actions according to the spirit of the times, and that he whose actions do not accord with the times will not be successful. Because men are seen, in affairs that lead to the end which every man has before him, namely, glory and riches, to get there by various methods; one with caution, another with haste; one by force, another by skill; one by patience, another by its opposite; and each one succeeds in reaching the goal by a different method. One can also see of two cautious men the one attain his end, the other fail; and similarly, two men by different observances are equally successful, the one being cautious, the other impetuous; all this arises from nothing else than whether or not they conform in their methods to the spirit of the times.

This is a rudimentary theory of inus (insufficient but nonredundant part of an unnecessary but sufficient) conditions. One size does not fit all. Not only are there many, path dependent ways to skin a cat, but the decision path is itself embedded in a larger dynamic field of supporting and thwarting conditions. Bloody messes are heavily represented in the possibility fan.
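For the programmatically inclined, here is a minimal toy sketch of the inus logic in Python. It is entirely my own illustration, not anything Machiavelli (or Mackie) wrote down, and the “routes to glory and riches” below are made-up stand-ins borrowed loosely from the caution/impetuosity passage above: each route is jointly sufficient but unnecessary (there are other routes), and each factor inside a route is insufficient on its own but non-redundant within it.

```python
# Toy sketch of INUS conditions. Each route below is treated as a *minimal*
# sufficient set of factors for the outcome ("glory and riches"), so every
# member is non-redundant within its route by construction. No single factor
# is sufficient alone, and no single route is necessary. The routes are
# hypothetical illustrations, not anything taken from Machiavelli's text.

SUFFICIENT_ROUTES = [
    {"caution", "calm times"},
    {"impetuosity", "turbulent times"},
    {"force", "weak opposition"},
]

def is_inus(factor, routes):
    """True if `factor` is insufficient by itself but is a non-redundant
    part of at least one sufficient (and unnecessary) route."""
    sufficient_alone = any(route == {factor} for route in routes)
    part_of_a_route = any(factor in route and len(route) > 1 for route in routes)
    return part_of_a_route and not sufficient_alone

for factor in ["caution", "impetuosity", "turbulent times", "force"]:
    print(f"{factor!r} is an INUS condition: {is_inus(factor, SUFFICIENT_ROUTES)}")
```

The only point of the toy is the shape of the logic: whether caution or impetuosity “works” depends on which route the rest of the field happens to supply, which is the conforming-to-the-spirit-of-the-times business above.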

Anyone (say, a Prince) who wants to be something like intentionally effective has to orient themselves to existing flows, working with what is already working. Even then, it’s important to clean out as many variables as possible so that the various flows can be channeled together into a metastable, homeostatic order, actively maintained by continuing collective effort:

And this must be taken as a general rule: that never or rarely does it happen that any republic or realm is well-ordered from the beginning, or altogether reformed from its old order, if it is not ordered by one… but a prudent orderer of a republic, if he has this will to benefit not himself but the common good… has to arrange to have this authority alone; nor will a wise mind take issue with any extraordinary action necessary to order a realm or constitute a republic…. [But] if one is appropriate to order things, the order will not last long when it remains on the shoulders of the one, but very well when it remains in the care of many, and when it is up to many to maintain it. Because just as many are not suited to order a thing, due to not knowing its good because of the diverse opinions among them, so once they know it they cannot agree to abandon it.” Discourses, book I, chapter IX.

So what does it mean to be “Machiavellian?” It certainly doesn’t mean to focus on ethics, which are at best a strand of the larger analysis that involved him. I suppose if he thought you could get anything done with ethics, he would have been much more interested in them. But it also doesn’t mean simply being an amoral shill for power. Machiavelli liked republics and liberty very much – they are explicitly the end goal, and his life’s work. But he didn’t think there was anything easy about getting or maintaining them; lots of things had to line up to make them possible, none of them conforming to abstract ideals. It’s an old point, older even than Machiavelli, but it bears repeating in whatever ways the spirit of the times calls forth. Nowadays we might say he didn’t have the privilege of focusing on ethics.

August 31, 2016

What the Universal Translator gets wrong

by Carl Dyke

One of the necessary little tricks in Star Trek is a device called the Universal Translator. What it does is something something something, and as a result all of the characters from all of the species and cultures in all of the galaxy can immediately and seamlessly hear and understand each other without having to labor over a lot of language acquisition. Handy!

In one of my favorite episodes of the Next Generation series of the show, the Enterprise encounters folks who speak entirely in metaphors (unless they are allegories, or even better, strategies, as Ian Bogost argues). Because the Universal Translator has no database of the original referents for the metaphors/allegories/strategies, it can render the words and names of the imagery but is confounded on their purpose and meaning. Frank incomprehension ensues. Brilliantly, the alien captain beams himself and Picard down to the local planet, where they are forced to work out an understanding under pressure from a belligerent prop critter (which makes no attempt to understand them, or vice versa). Even though Picard learns only a phew ‘phrases’ of the alien language, it is enough to stand down tension — although not to ground a relationship, so off the aliens go again.

So in effect, they get to the level of ritualized small talk, and like so many of our encounters in real life, that’s as far as they’ll ever get. What the Universal Translator gets wrong is that you could ever advance to understanding of another person or culture or conceptual complex just by translating the words without all of the unarticulated paratexts that give them meaning and purpose. This is also a thing that all of those lists of ‘untranslatable’ foreign words get wrong from the other side. No word is untranslatable, although sometimes it takes more than one word to do it. What’s tricky is all of the stuff embedded in the word that doesn’t come with it in the verbal substitution.

A good recent example is Paul Berman’s takedown of cross-cultural misunderstanding over French banning of Islamist dress and particularly the ‘burkini’, full-cover swimwear for women. Berman focuses on the French word laïcité, which is routinely described in American commentary as an untranslatable mystification justifying all manner of offenses against fundamental values like personal freedom. So Berman notes that the word is not at all difficult to translate, ‘secularism’. But what is hard to convey is the thick concept embedded in the thin word by the long history of the French working through all of its permutations in exhaustive public debates on the way to installing it as one of their fundamental national values. Not secularism, secularism. Secularism, get it? Which means those women are not just exercising their personal rights of choice on those beaches, they are directly and explicitly attacking the French nation as such.

Berman does not go on to discuss how this kind of argument works pretty much the same if we’re talking about Americans freaking out over Muslim immigrants trying to get the schools to take Sharia law into account in dress codes and menu options and such, but if he did he might reasonably reply that in this context the fight is over which religious fundamentalism will dominate public spaces rather than its complete removal therefrom. And the profound differences between a liberal conception of serial diversity vs. a republican conception of compelling moral solidarity and a conservative conception of wholesome homogeneity.

What strikes me here is, yet again, that the same words can have not just different meanings, but completely different existential and conceptual underpinnings, different logics of practice to use a concept given meaning via Bogost, Bourdieu, and eventually Marx. Both the United States and France assert secularism as one of their core values. But it turns out that looks very different if you actually mean it. So what we’re seeing with France is an experiment in making secularism a finally deciding principle, rather than a nice bonus as long as nothing else important is at stake. In France, secularism grounds individual rights. When individual rights don’t express secularism, it’s the rights that must lose. In the U.S., individual rights can include secularism, but often don’t. Ours is the liberal secularism of not taking sides, in fundamental contrast to the republican secularism of defining a moral order prior to individual choice or group affiliation. But then, individual rights for us are themselves a fundamentalism.

All of this emerges from evolutionary histories. As Berman notes, France makes a lot more sense if we remember the religious wars that shredded Europe for a couple hundred years. Then the revolutions made the blood flow. Then primitive ethnic nationalism twice mixed blood and soil. Then the empire agonizingly collapsed in the blood rivers of identity politics old and new. France has tried out a whole bunch of extremisms, and is now extremely extremism averse. Well, except for the National Front, who don’t so much miss the lessons of history as proudly embrace their gruesomely formative slaughter. They relish the fight. They don’t want to ban burkinis, they want to throw the Muslims into the sea. To secularism fundamentalists, burkini bans are congenially available as a moderating response to that kind of extremism. Two birds with one stone. To rights fundamentalists, burkini bans are unthinkable, uncanny, horrific. Polluted and polluting. Their range of understanding and response are restricted accordingly.

Oddly, or maybe not at all oddly, I find myself in the same predicament in my sabbatical project, and especially trying to explain my sabbatical project. “A history of theories of complex systems,” I say. Most people know what all these words mean. They can use them creatively in ordinary conversation. After all, complex systems are all around us. I mention examples. It’s easy. A farming colleague has begun teasing me about how everything is a complex system. It reminds me of the old joke about Clifford Geertz, who after writing about ideology as a cultural system, religion as a cultural system, chickens as a cultural system, politics as a cultural system, and your face as a cultural system (ok, I made that last one up, but it’s plausible enough), was supposedly working on his magnum opus, “Culture as a Cultural System.” Haha, Carl and his complex systems.

But no, look, not complex systems, Complex systems, get it? The difference is what happens if we start to take this seriously as a conception of the world. It’s not that things are complicated or that they can get unruly. It’s not about adding a variable or two to approximate a more complete analysis. It’s not that there are sometimes multiple factors and causes and motivations, and it can be tough to untangle them. All of that is sort of true-ish, but still completely missing the point. It’s taking complex systems and making them the rule, not the exception. It’s that, actually, situations about which the foregoing is entirely true are vanishingly rare and exceptional, and generally require massive inputs of effort and selective attention. The conceptual foundation of complex systems analysis is fundamentally alien and opposite to the way most of us have been taught to think about the world, which is in terms of isolating effective causes, and making shit up when that doesn’t work. God(s) did it, Fate did it, the Jews did it, The Man did it, men did it. Obama did it, Ike did it, Reagan did it, Hitler did it, Lincoln did it, MLK did it, Susan B. Anthony did it. Fertilizer did it, antibiotics did it, free trade did it, rational choice did it, the bourgeoisie did it, Bretton Woods did it, Socrates did it, Kant did it, Helen’s face did it.
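
(For the programmatically inclined, here’s a toy sketch – not the project, and not even a properly complex system, just the textbook logistic map written out in Python – of how quickly “what caused that?” stops being a useful question once a rule feeds back on its own output. The numbers are illustrative and nothing more.)

def logistic_orbit(r, x0=0.2, burn_in=200, keep=5):
    # Iterate x -> r * x * (1 - x); return a few values after transients die out.
    x = x0
    for _ in range(burn_in):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1 - x)
        orbit.append(round(x, 4))
    return orbit

# Same rule every time; only the strength of the feedback changes.
for r in (2.8, 3.2, 3.5, 3.9):
    print("r =", r, logistic_orbit(r))

Run it and r = 2.8 settles to a single value, 3.2 to a two-beat cycle, 3.5 to four, and 3.9 never settles at all. Nothing was added except a little more of the same interaction.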

Can you change your life by changing your diet? Sure. Will dropping red meat and eating yogurt with probiotics do it? Gosh, where to start. Maybe let’s talk about how we used to eat the stuff that the animals we ate were digesting. How paleo can you go? Yogurt, um. There’s a food system, there’s a culture of food system, family recipes and such, there are politics and economics, markets and climates, there are a lot of habits to talk about, there are billions of beasties doing a whole bunch of interrelated work in the soil you may not want to wash off that produce and in your guts, your guts include your skin by the way so let’s talk about soap and makeup and moisturizer. Will Donald Trump ruin everything, or fix everything? Which everything. Are we starting with checks and balances? Will we talk about the relative advantages and disadvantages of large and aging human populations? How exactly is he going to make the Mexicans do anything? How’d his first two marriages go? You say at least he tells it like it is? Gary Johnson, the candidate of choice! Jill Stein, for moral purity! I can’t even get there from here. Shaka, when the walls fell.

March 13, 2016

Another pointless exercise

by Carl Dyke

Whatever it is that academics do, it’s pointless. Down in Florida, the Governor is sure enough of this to heroically save the taxpayers their wasted dollars by defunding junk degrees like Anthropology that don’t lead directly to jobs. Here in North Carolina the rhetoric is the same, and the plan seems to be to squeeze funding for higher education until the juice of usefulness is extracted from the pulp of waste. Around the nation trustees drawn from the world of business select and then praise university presidents who talk about preparing their students for the world of business. Because obviously, if we’re going to be paying for education, it needs to pay off, and right pronto.

What I really think is that this is all part of a complex evolutionary dynamic incident to global flows of resources, capital, and labor; and ultimately, as with all things, the capturable energy of the sun. But because that kind of analysis is hard and not immediately entertaining, I’m going to talk about tribal spear-waving and questionable metaphors instead.

So back to defunding the higher educations, Peter Dreier isn’t helping. In a play right out of the now-venerable Postmodernism Generator he repeats the Alan Sokal experiment and gets himself invited to the “Society for Social Studies of Science and the Japanese Society for Science and Technology Studies” conference in Tokyo, with a paper on “the absence of absences” that is gibberish he has just flat pulled out of his butt. A little more absence in that paper, please. Dreier is a sociologist, so he thinks maybe some things academics do aren’t completely useless. But he’s not too sure about the other papers on his panel, with titles like “The Motility of the Ethical in Bioscience: The Case of Care in Anti-ageing Science” and “Agnotology and Privatives: Parsing Kinds of Ignorances and Absences in Systems of Knowledge Production.”

It further does not help that Dreier himself may have been (or might as well have been) meta-pwned by the burgeoning for-profit pseudo-academia industry. Globalization + (publish or perish) = shenanigans. I’ll mention here that I have seen no particular signs of rigorous curation at any conference I’ve attended over the past thirty years, in Tokyo or otherwise. Because how could they, really, and a conference must have papers like a dog must have fleas. So among other questionable uses of my time I have sat politely (if you don’t count the squirming and eye-rolling) through about forty-leven bright young literary scholars earnestly and interdisciplinarily telling me stuff they happened to notice about Mary Shelley’s Frankenstein.

Which brings me to my new colleague Cameron’s recent lyceum presentation, “Why Are We Comfortable with a Serial Killer on Cereal Boxes?: Frankenstein in Pop Culture.” I’ll get to what was good about my guy’s thing in a second, but by way of transition I must first remark that it was perfectly, gloriously, in every way (well, except no sneering righteous fulminations against the patriarchy, white supremacy, neoliberalism, the American empire, or what have you, so sort of tolerable in that sense), exactly what the critics of academe have in mind when they cut every precious tax dollar they can get their righteously crusading gauntlets on from this useless nonsense. Charmingly and eruditely, in the best tradition of the Whatsis Critical Something Justice Cultural Something Studies that are the very first targets of the reformist backlash, and with PowerPoint slides including lots of hot babes, Cameron noodled his way through two hundred years of arbitrarily selected and completely uncontextualized pop culture in order to make the point that – what? I can’t remember, because one never does with these things. Pointless! And for this he’s going to get social acclaim and publicly subsidized lifetime employment in a job that is objectively one of the best humans have ever invented. Which he will then complain about. (Cameron himself, maybe not so much a complainer. But you follow me.)

Now we come to the turn. I won’t try to justify any of the Dreier stuff; it’s bad, and maybe systematically bad. There’s a lot about academe that not only enables but encourages charlatans, frauds, and hacks.

But I mentioned Cameron was erudite and charming, and he was. He also made no pretence that what he was up to was in any way immediately important or useful. It was, first of all, an interesting stroll around a landscape, indicating various notable features. Folks regularly journey to distant lands and pay thousands to professionally charming experts for this sort of pointless tourism. I think most everyone understands that the payoff of being herded around the sights is not some bankable return on investment. Our university lyceum, which is a public presentation, works very well when it’s that sort of tour. Our classes too, for that matter. Still, taxpayers don’t subsidize tourism (get it, I just made a funny) so I can see why this might not be good enough. And of course we don’t grade tourists (look, another funny).

So Cameron’s Frankenstein thing was a tour. So was my dissertation. So is this and many other blog posts. So were Dyke the Elder’s early papers on political philosophy, which I’ve tracked down and skimmed with great pride. He walks around the likes of Rousseau, indicating notable features.

But thinking about Rousseau or Gramsci or Frankenstein – yet again, again and again and again for crying out loud, Frankenstein again, really??? – fits a metaphor I like even better: a workout. When I run, I run in a circle. When I go to the gym, I can’t expect to end up somewhere after a half hour on the treadmill. When I pick up a weight, it’s only to put it back down again. I don’t notice the weight much, or remember it in detail. It would be silly to. Furthermore, in terms of immediate return on my investment of money, time, and energy I am not gaining anything! I’m getting tired and sore; I’m actually tearing my muscles down! I leave the gym objectively worse off than I came, not to mention the wear and tear on the gym equipment. The whole thing is a hugely expensive waste, just like the nth Frankenstein talk, Cameron’s and my and Dyke the Elder’s careers, and the whole liberal education racket.

(So here I’m going to interject that I don’t go to the gym any more. I always hated it; I did it for many years because, once you get past the short-term frustrations and degradations, you do in fact get stronger, more fit, more resilient, and, if you crosstrain properly, more generally capable. But now I live on a farm, which is full of physical tasks that work and stretch my body in the necessary ways. Living a life that naturally challenges and develops you is obviously preferable to going to the gym, and to school. Or so the Stoics said a couple thousand years ago. Those lives are not widely available, unfortunately, and as those mouth-breathers out in Oregon have recently demonstrated, are not automatically edifying.)

The point is that the weights and exercises are not the point. I don’t care about weights or treadmills as such. In the same way I don’t care if my students care about the finer points of distinction between National Socialists and Social Democrats (been doing a lot with Nazis this semester). I’ve heard and can make an argument that this would immediately make them better citizens, but to be honest I don’t think it’s actually going to change anything as such. Nazis certainly knew those points of distinction, at least to pass the test and crack ‘the right’ heads; that knowing was not automatically edifying either. I don’t expect Cameron cares much if the audience at the lyceum can still say exactly why pictures of conventionally attractive women showed up in a talk on Frankenstein. That connection he showed us how to make was just an exercise, a weight to struggle with for a second – put it down when you’re done, that’s fine.

The same politicians and businessmen who side-eye the return on investment of publicly subsidized education then complain to me on the tennis courts about how intellectually flabby and useless the college graduates they hire are. From my classes I know exactly who all these people are. They’re the ones who skipped the workouts.

P.S.: At this point we could talk about a ‘food for thought’ metaphor and fatty snacks. After all, even the most nourishing meal turns to shit by the next day. Circle of life, baby. Instead I’ll mention that I’m sorry to have been so long away from this blog, which I still love and treasure. ‘Buying the farm’ has chewed up a lot of bandwidth. In the meantime anybody who’s still following here and who’s wondering what I’m thinking about should friend me on Facebook (Carl Dyke, Methodist [University], Cameron, North Carolina), where I do a lot of microblogging, and you might also be interested in the links I and my colleagues share on the Facebook Methodist University Department of History page. You can see lots and lots of farm pictures on Rachel’s Instagram, therachelherrick.

July 5, 2015

Making work

by Carl Dyke

Among other things, the unfolding drama in Greece is a reminder that in the world today, and for quite some time past, there is not and has not been enough work for people to do. Rather than find some other way to organize and valorize human life, the response to this has been to make work.

In Greece, among many other places, this has taken the form of massive systems of neo-feudal governmental and quasi-governmental employment (farmers of government payments like the defense and health industries, for example), funded through various extractive and inventive strategies ranging from taxation to money printing to ‘public debt’. (Since the fiat currencies of the modern state are essentially circulating debt, there is no essential difference between these strategies except the levels of public confusion and therefore the pseudo-politics caused by each.)

Because countries like Greece are not big and scary enough to control their own narratives, this fabrication of life and value is commonly referred to there as ‘corruption’. In countries big and scary enough to control their own narratives like the United States and Germany, it is referred to as ‘the public sector’. But in all cases most of the work in question deploys the otherwise unemployed to provide each other, at each others’ expense, with ‘services’ the need for which is largely created by their availability.

Clearly this is not ‘gainful’ or ‘productive’ employment, except in the pragmatic and existential senses that life and value are created by it. As the story goes, truly productive employment only occurs in the ‘private sector’, where the work is driven by real market demand rather than corrupt and/or unproductive shenanigans.

Which brings me to landscaping.

versailles gardens

Landscaping might be described as an inherently unproductive modification of land. Farms are not landscaped, they are worked. Lawns without sheep are an ecological monstrosity, and ‘yards’ need only be cleared enough to keep pests and predators from immediate contact with the buildings. Unremarkable local plants do the trick just fine with minimal inputs of effort. Spare land may well become valuable through garden planting, or be left fallow. Of course flowers that attract pollinators, fix nitrogen, and the like may add splashes of color and texture. Productive land has its own beauty, as do the lumpy bodies of productive people. Human / land interaction is traditionally labor and attention intensive. Ordinary folks lived like this for millennia.

And yet, in the United States alone landscaping is an $80 billion ‘industry’. Some of this of course is public and quasi-public landscaping like government lawns and highway medians, but most of it is private and therefore market driven. There is a robust demand for landscaping.

The need to beautify commercial/residential property as a place for relaxation, entertainment or work, has long nourished the interest in landscaping. The worth added to the value of property by decorative structures, ponds, patios, and green-winding pathways too cannot be undermined. Keeping in view the growing popularity and importance of landscaping as an art, science, and commercial value proposition, it is of little surprise that landscaping services has now become one of the most important domains in the overall services industry.

From each according to their abilities, to each according to their needs. Over 800,000 people are employed in just the direct provision of landscaping services such as “sod laying, mowing, trimming, planting, watering, fertilizing, digging, raking, sprinkler installation, and installation of mortarless segmental concrete masonry wall units,” over 15,000 in “death care services” alone. This does NOT include the production of landscaping supplies and equipment, agriculture of sod and ornamental plants, industry and academic study of same, fractions of transportation, water, and sewer infrastructure devoted to moving the stuff and the stuff’s inputs and outputs around, yard ‘waste’ removal, and so on. Taken all together, it would probably be safe to say that private demand for the inherently unproductive modification of land annually generates about a million jobs and about $100 billion. Yay, markets!

As the husband and friend of artists I know that there are all sorts of ways to argue about the nature and value of beauty. As the (hopefully) soon-to-be owner of a farm originally set up for ornamental livestock (horses) that we hope gradually to convert to boutique farm-to-table production, I am aware that there are no clear lines between the production and productivity of aesthetic and alimentary experience. The other thing that folks did for millennia was eat gruel. And horseflesh.

1265 panorama

But this is my point. If we take an old-school approach to productive labor, there’s very little of that left to do after the machines get done. MOST of the work that people do now, especially in the developed world, is makework. My job certainly is, in a way that’s obvious enough to produce real strains at the point of sale, and incredibly vulnerable by the productivity standards that waves of businessy types periodically try to enforce on it. For education (employment: 8 million+) in anything that required real productivity, tech schools and apprenticeships, largely taught by mechanical reproduction, would surely do the trick. The rest is landscaping.

And therefore, makework had better be alright. As much as I’d like to get on my high horse about Greek (or Italian) ‘corruption’, there’s none of my life that doesn’t participate in the same dynamics. I try to pay off, maybe in ways public sector employees give up on or never learn, but given the spread of outcomes that’s not much more than noise in the signal. What does Germany think it’s doing that’s so much better than what the Greeks are doing? For the life of me, I can’t work that out.

sisyphus

July 24, 2014

Snowpiercer

by Carl Dyke

Saw an interesting movie last night, “Snowpiercer.” Based on a graphic novel, I gather. The premise is that in response to global warming, the governments of the world leap into action and seed the skies with a chemical meant to bring temperatures down. It does, there’s a catastrophic ice age, and all life on Earth is extinguished. Except for one special train, the work of a visionary inventor, that travels a continuous loop around the world with the few remaining humans, some fish and bugs and whatnot aboard.

The humans are segregated on the train by their conditions of boarding, from first class up front through non-paying refugees in the rear. The plot is driven by the revolt of ‘steerage’, so to speak. There’s a sort of Ayn Randian quality to the basic setup – in the distrust of government, of course, but also in that the tail sections in fact contribute very little to the functioning of the train (beyond the odd child of the correct height to tend the innards of the engine) and owe their entire existence to the charity of Wilford, the visionary industrialist and engineer. Consequently, the ethics of sympathy for the poor downtrodden are more Kantian, a la categorical imperative, than Marxist, a la exploitation and alienation. We then go back to Rand to admire the effective gumption of that one leader and his few talented confederates who organize the (incredibly violent) breakout. None of this is articulated with any great care.

What is articulated with great care, notably by Tilda Swinton in a magnificent performance as Wilford’s top henchwoman, is an ideology of sustainability based on rigorous ordering of a closed system. Over and over the rulers explain that the whole can thrive only if each part keeps its place in exactly calibrated balance. It’s a fabulous caricature of vulgar sustainability discourse, and pokes ruthlessly at the fascism that’s never too far away when urgent images of righteous living in relation to existential threat are about.

February 13, 2014

Hall, Gramsci, hegemony, complexity

by Carl Dyke

I just had what might have been a good moment on the Facebooks. Jim Livingstone posted on how the New York Times hasn’t gotten around to officially noticing the death of Stuart Hall yet (neither had Dead Voles, until now), and in that context I wrote this:

It’s interesting to me how Hall embodied the thesis [“the ‘dispersal of power’ from state to society, ca. 1870-1930, as Gramsci tracked and projected it in the Notebooks (trans., pp. 210-76), thereby explaining why a ‘war of position’ now superseded a ‘war of maneuver’. In effect, a brilliant manifesto for cultural politics,” Jim Livingstone]. He basically WAS Gramsci: layers of marginality radicalized by immersion in the center. But where for Gramsci the hot revolution still looked like a plannable endgame, for Hall it was off the table right from the start, precisely because of that decentering of power. But – given the catastrophes of communist centralism, I think it’s fair to wonder if power has ever not been decentralized, really, so that the whole hegemony thesis ends up looking like a really rough draft of an actual theory of complex systems.

Seconds later, I noticed that here at last was a handle that made me actually want to pick my old Gramsci dissertation / book back up. Until now, other than posting the most recent version here online, I’ve abandoned it to the gnawing of the rats, because I couldn’t figure out how it was anything but yet another idiosyncratic take on well-worn materials. I didn’t have to publish it anyway to get tenure, so I didn’t. Aren’t there enough of those books cluttering up the shelves?

But there’s this thread of analysis in the piece that I always quite liked, and didn’t really know what to do with. I argue that the theorists of the early 20th century really weren’t equipped to cope with the actual complexity of the world, and so they resorted to what I called ‘space maintainers’, sort of folded up theoretical napkins under the short empirical table legs. Constructs that weren’t nearly constructive enough. Gramsci’s theory of hegemony then looks like an attempt to actually theorize complexity rather than shortcutting it somehow. Still, not surprisingly, very shortcutty and so not a good candidate for adoption here and now, but in context quite the thing.

So in that Facebook comment on Hall and Gramsci I haven’t actually said anything new to me; I’m still gnawing on the same bone I always was. But what’s changed is how much I now know about the theories of complexity that followed, and how they’ve gradually begun to inform the human studies. All of our discussions on Deacon, Juarrero and so on, for example. Which means I’m now in a much better position to frame the Gramscian / Weberian / Durkheimian moment in the history of theories of complexity, for example by seeing Hall as what Gramsci looks like in a different moment of the intellectual-evolutionary process.

And since this feels like it was my insight and agenda all along, just come into a more satisfying unfolding, I don’t have the uncomfortable feeling I always had when I was trying to think of some way to graft something more interesting onto the stuff I know. Plus, the stuff I get to read to come up to publication speed on this version of the project, and the way I get to read it, actually feels interesting and valuable in its own right, and not just a bunch of legitimacy hoops to jump through.

All of which means I actually have a clear reason to apply for a sabbatical, which is long overdue. So now we get to see if this is a passing enthusiasm, or a project that actually has legs. Cheers!

May 19, 2013

Survival of the fit enough

by Carl Dyke

In my perusings I just came across this interesting item:

Michael Vick says new Eagles coach Chip Kelly “taught” him how to properly hold the football while running. The 10-year veteran was apparently being serious. “The other day, I broke out in the pocket, and the first thing Chip told me was to tuck the football,” Vick said. “So I showed him how I was running with it, and he looked at it and he knocked the ball right out of my hands. And he was like, ‘Hold it like this.’ And what he told me felt comfortable. I had a tighter grip on the football. That should secure that problem as long as I work on it.” It’s beyond belief that Vick is implying that he not only didn’t know how to properly hold the football, but had never been taught by Dan Reeves, Jim Mora Jr. or Andy Reid, but here we are. Vick has lost 12 fumbles over his past 35 games, which is far too many.

How do four people – Vick, Reeves, Mora, Reid – who do a thing at the very highest level, who have pretty much done it all day every day for their whole lives, not notice there’s a basic, outcome-changing problem and take easy steps to fix it? Is that surprising?

In the book discussion over at The Long Eighteenth I’ve been trying to both discuss and, predictably, demonstrate this effect. Gikandi looks at the slavery / culture of taste complex; sees that the one both enables and constrains the other; and apparently can’t think of any way that could make sense other than grand psychic defense mechanisms like repression and libidinal sublimation. Big effects must have big causes. Has Vick been repressing a desire to lose this whole time? Or did he just carry the ball a certain way, mostly not drop the thing, and therefore never think or feel much about it? After all, fumbling’s part of the game.

Do analyses like Gikandi’s repress a dark terror of the mindless operations of unreflective habit? We all get to have our favorite theories, but jobs go smoother if you use the right tools. In a book in large part about the history of the judgment of taste, with a 30-page bibliography, Gikandi mentions “French anthropologist” Pierre Bourdieu just one time, as having called “a set of socially acquired dispositions and predispositions” habitus once (218). Habitus, a concept more pertinently developed in Bourdieu’s Distinction: A Social Critique of the Judgement of Taste, is the new grip that would have fixed some of Gikandi’s fumbling; but like Vick, Reeves, Mora, and Reid; Hume and Jefferson and the Beckfords; teachers, students and administrators, he’s been getting along well enough without it.

May 10, 2013

What counts as success

by Carl Dyke

Reading final papers and course journals now, this smacked me between the eyeballs. For better or worse, this is what counts as a major success to me (from an introductory world history journal, so don’t sweat the typos). Our topic this semester has been ‘conditions of work’:

The last couple weeks, in class, we have investigated the research process and our second papers. I am learning that no matter what time period we are individually studying or what country, most of the same rules apply. There will always be a certain “group” within a population that is getting miss treated because they can be. In most cases, victims are not victimized because of some racial intention or ill-will, it’s because of necessity. I think that when something needs to be done that no one else wants to do, society “volunteers” people to do it. If that group doesn’t have the power or will to object, they fill the void. Once this precedence is set, the negative connotations follow.

Is that the end of the story? No, of course not. But to me, at least, this cleans out the hero/villain juvenilia and the ideological just-so stories and gets the line of investigation pointed toward increasingly better understanding. Yay you, unnamed student.

April 1, 2013

Another one on linked learning

by Carl Dyke

Some of you may still have a shred of interest in this topic, so here below is a post I just wrote for my school’s gen ed debate blog. Again, the issue is a challenge to the plan that just passed the full faculty, by a group that wants to add back more ‘liberal arts’ courses and incidentally remove the linked learning component. (Btw Dave, re: evidence, I have done other posts compiling links to lots of educational research and comparable cores at other unis.)

As some of you may know, I run a tennis group up in the Cary area. I have about 100 players on my distribution list and some subset of us get together twice a week to play and socialize. I also play in USTA leagues in Cary, which puts my network in the hundreds.

Because it’s Cary, and because it’s tennis, a very large proportion of these folks are mid to high level professionals. I play with CFOs and chief accounting officers of major corporations; state legislators; small business owners; pharmaceutical executives; IT and data security professionals. We hang out after we play and talk. As a result, over the years I’ve accumulated a fairly dense ethnographic understanding of how these folks think and what they want. And because I’m a college professor, we’ve talked a whole bunch about how they think about college education and what they want from it.

It is absolutely true that, as Lloyd just said in the last post, they have abysmally low expectations of the value of a college degree. They routinely interview and hire candidates with fancy educational credentials who just as routinely turn out to be fundamentally unprepared to be useful. From ample experience, they expect college graduates to be clueless and high-maintenance. They are resigned to this fact. They hope for a little technical polish as a writer and communicator (they get even that rarely) and a general middle-class culturing, by which they mean an acceptance of the value of the enterprise and a certain amiability about following instructions. Because they don’t expect more they don’t look for more, as Lloyd said.

When I talk with my friends about a more ambitious agenda for college education, one that involves teaching students to be resourceful, independent learners who can make connections, figure things out for themselves, and adapt responsibly to complex, unfamiliar situations, they get a faraway, wistful expression. These dispositions are rare and precious to them. I was talking this weekend with a consulting engineer who works regularly with the state department of transportation and a P.A. at a major cardiology center. They bonded over the irrational outcomes that are regularly produced in infrastructure and medical care by rigid systems of rules designed to intercept bad decisionmaking and create predictability – because the people involved can’t be trusted to think their way through the variables of particular cases, and a mediocre outcome is better than a disastrous outcome.

Which brings me to general education. There is enormous value in transmitting what is already known to the young. A firm grounding in the traditions of knowledge is essential to the educated person. Such an education can do much to guard against disastrous outcomes. But as proponents of the alternative core have amply shown, exactly this grounding is the focus of the vast majority of general education programs at our peer institutions, as it has been for many, many years.

And these are the graduates my informants find so disappointing.

It may be that our students ‘should’ be able to learn a more resourceful kind of thinking from our classes, but mostly they don’t. And not just ours. And it’s for the simple reason that we don’t show them how. This is why I think the alternative proposal is out of balance – because the wonderful things in it don’t have the impact they should as long as we’re not intentionally showing the students how to put them together and make something of them. This is the college education my tennis buddies would love to see, and that they’re mostly not seeing. This is the opportunity we have now at MU with the Linked Learning initiative, which is why I think it’s short-sighted to vote it out just in case we vote it back in again later.

March 24, 2013

Wild yeast sourdough starter

by Carl Dyke

As a logical next step in my fiddlings with bread-making, I just baked my first sourdough loaf with home-made wild yeast starter the other day. To eliminate all suspense, it came out great – by which I mean, it reminded me of all the things I like about sourdough bread without introducing any new negative associations. I especially like it because I did it ‘all wrong’, which is what this post will now document.

“Softly now, softly now – try it, you won’t die.” Silkworm, “A Cockfight of Feelings”

So, how I went about this is I got on the ol’ internet and googled ‘sourdough starter’. A little reading got me pretty quickly to the further qualification, ‘wild yeast’ – thus distinguishing the truly artisanal starter from the kinds someone else made that you can buy for a whole lot of money from specialty baking stores, if you’re a clueless snob, or Amazon, if you’re even more clueless but at least not a snob. So once I had the correct verbiage for cheap-ass diy starter, I did some more searching and read through some instructions. (I omit the links because I just told you how to diy, get it?)

Well, opinions about exactly what’s happening with sourdough starter seem to vary a bit, starting with where the wild yeasts are actually coming from. Is it the air around us? Is it the flour? Is it the whole grains you must treat with excruciatingly careful reverence to yield Gaia’s bounty of biomagic? With just a slight knowledge of these matters, I decided it was probably all of the above, plus everywhere else, since that’s where yeasts are. So I ignored the instructions that said I had to be careful not to cover the starter vessel with plastic wrap or anything else impermeable. I also ignored the instructions that said I had to hermetically seal the starter vessel, sterilize every instrument that ever came in contact with the starter, wear a hazmat suit, never use stainless steel, always use stainless steel, never use silicon, always use silicon, and so on.

Go Green!

In fact I pretty much ignored every single instruction designed to seal off the wild yeast starter from the environment it had somehow come from. I also ignored all the instructions designed to make my starter a delicate, difficult thing that required constant, meticulous care. I know people whose lives are given a rich sense of meaning by arranging to provide constant, meticulous care to other creatures, but that’s not me and if it was, I’d pick creatures other than yeasts and lactobacilli.

Speaking of lactobacilli, I paid a lot of attention to discussions of the multi-biotic nature of sourdough starter. It’s not the yeasts that are making the sour, it’s the bacteria. But the bacteria don’t make the bread rise, and they also have a tendency to make the ‘spoilt’ version of sour when they get lonely and pig out. So a functional sourdough starter is actually a community of beasties each creating some of the conditions for each others’ happiness, encouraging each others’ strengths and discouraging each others’ excesses, and incidentally each handling part of a fairly complex little biological process that assembles into a tangy leavening. Which of course wasn’t at all what they ‘intended’, but makes an excellent complement to garlicky cream cheese. So anyway, ‘building’ a starter is a process of getting that community together to work out a harmonious relationship under the conditions they enjoy.

“Control is when others’ locked-in interactions generate a flow of collective behavior that just happens to serve one’s interests.” Padgett and Ansell, “Robust Action and the Rise of the Medici, 1400-1434;” see also Padgett and Powell, The Emergence of Organizations and Markets (2012).

Those conditions are: flour and water. We’re talking about fermentation here, after all, which in real life is hard to keep from happening if you’ve got moist sugars around. Which brings up the mold problem, of which there’s plenty in my house, the dominant strain for unmysterious reasons being ‘bleu cheese’. But fortunately, between the acid the bacteria start producing right away, the alcohol the yeasts start producing soon enough, and the natural division of labor among the artistes of organic decomposition, mold is not actually much of a threat if you’re not trying hard to kill the yeast and bacteria somehow.

Mmmmmmm, stinky.

OK, so I read a whole lot about ambient temperature, water temperature, using bottled water, using distilled water and adding minerals back in, using orange juice, using pineapple juice, using white flour, using rye flour, not using white flour, not using rye flour. With just a slight knowledge of these matters, I reflected on the global success under the most extreme conditions of yeasts and lactobacilli, and decided not to sweat any of these factors too much (although, in principle, I wouldn’t have been completely surprised if a chlorine spike in my suburban tap water had set the critters back a bit). I did decide to take some of the chance out of the lactobacilli, mostly because I had an old tub of plain yogurt handy. And no, it was not any particular brand or type of plain yogurt, but it was past its expiration date as it happens.

I also looked at a lot of instructions about getting a kitchen scale, getting one that measures in grams because they’re more precise, calibrating hydration ratios, using a tall, straight-sided vessel with a dedicated lid, sterilizing this vessel and your hands before handling it, scraping down the sides so that, gosh, I don’t know. So anyway, here was my beginning recipe for my wild yeast sourdough starter:

Some flour
Some water
Some plain yogurt.

Roughly the same amount of each, by eyeball, probably a bit less yogurt because I thought of that as a ‘supplement’.

“My friends always say, the right amount’s fine. Lazy people make rules.” Silkworm, “A Cockfight of Feelings”

All of this went in a plastic bowl (with sloped sides because it has sloped sides) I also eat cereal, pasta, and curry from sometimes, with some plastic wrap loosely draped on top to keep it from drying out too fast. This then went on a corner of the kitchen table I wasn’t using for anything else right then. I am woefully ignorant of the exact temperature of this spot, but I can guarantee it was neither hot enough to bake nor cold enough to freeze my arse. I started with bread flour, I think, but I ran out of that before the next feeding so I switched to rye for a while because I had a bag of that open and it kept getting mentioned in the instructions. Then for a while what I had open and easy to get at was some white whole wheat flour, so I used that.

And speaking of feeding, I read all kinds of instructions about pouring out exactly [some ratio I forget] of the starter before each feeding, adding back [another exact ratio I forget] of flour and water, doing this once a day at first and then every 12 hours, carefully swabbing down the sides of the container, adding strips of tape to allow precise measurement of the starter’s expansions and contractions, holding the container between your knees and counting to 6,327 by perfect squares, and checking carefully for ‘hooch’, which is such a precise technical term that at least half of the folks using it have no idea it’s why there’s NASCAR.

Medicinal purposes only, of course.

What I did instead was pour some out and add some back, roughly the amount it had expanded in the interim; when I remembered it, which was anything from a couple times a day to every couple of days. I tried to keep it pretty soupy because I read the beasties like to be wet, and I’ve found this to be true. I did this for something between a week and two weeks – I did not keep track. About day 2 or 3 it got that sourdough smell, then it settled into a kind of sweet peachiness I had not expected. I got back onto the internet and found a long forum thread on the many, many different permutations of ‘sweet peachy’ smell ranging all the way to ‘spiced apple’ that can be expected from a properly harmonizing community of yeasts and bacteria. Reassuring. So when I got sick of waiting any longer, although I think I was supposed to, instead of pouring out the extra I poured it into a bowlful of the flour I happened to have handy and open right then. Whole wheat, rye, and kamut as I recall – kamut btw is fun stuff, an heirloom grain that has a lovely buttery flavor and adds amazing elasticity to a dough.

Here was the ‘recipe’: salt in the right amount for the flour, bit of sugar to be friendly, touch of olive oil and enough warm (tap) water to make a wet dough just drier than a batter. Because the beasties like to be wet. Once they’d fermented that up for most of a day, I stretched, folded, smeared, punched and kneaded in enough more flour that it would stay in a loaf shape (not doing this is how you get ciabatta); let it think about that for maybe an hour longer; threw it in a hot oven on the pizza stone; dumped some water in the bottom of the oven to get some steam to keep the crust from setting too quickly (thank you internet); and some time later there was delicious whole wheat / rye / kamut multigrain sourdough bread.


Through all this I was aware that by failing to control for every possible variable the project could go horribly awry rather than pleasantly a rye. I reflected on the $.50 of flour and aggregate 10 minutes of work that would be irretrievably lost, and decided to roll those dice.

Does this mean the variables all that internet fussing is trying so tightly to control don’t matter? On the contrary, I’m sure they do. But my little experiment suggests most of them – other than flour, water, a container, and temperatures somewhere between freezing and baking – are conditions of the ‘inus’ variety:

“The inus condition is an insufficient but non–redundant part of an unnecessary but sufficient condition” [quoting Cartwright, Nature’s Capacities and their Measurement, 1989, citing Mackie, The Cement of the Universe, 1980]. It’s best to read that backwards: you identify causal conditions sufficient to produce a given effect, but know that there are other conditions that could have produced the same effect. Within the sufficient conditions you’ve identified is a condition that couldn’t produce the effect by itself, is separate from all the other conditions that along with it could produce the effect, but must be among them for the effect to be produced through the causal pathway that’s been picked out. The inus scenario (any scenario containing an inus condition) shows up frequently in attempted causal analyses, and has to be accounted for somehow in any comprehensive causal theory (Chuck Dyke aka Dyke the Elder, “Cartwright, Capacities, and Causes: Approaching Complexity in Evolving Economies,” draft-in-progress).
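
(To make the shape of that logic concrete, here’s a toy sketch in Python – mine, with condition sets invented for illustration, and none of it from Dyke the Elder’s draft. ‘Water’ alone won’t get you a starter; the recipe it belongs to will; the recipe falls apart without it; and the recipe itself is dispensable, because other recipes also work.)

# Hypothetical sufficient condition sets: any one of these, on its own,
# is enough to get a starter going – so no single set is necessary.
SUFFICIENT_SETS = [
    frozenset({"flour", "water", "room temperature", "time"}),
    frozenset({"rye flour", "pineapple juice", "room temperature", "time"}),
]

def produces_starter(conditions):
    # True if the conditions on hand include at least one known-sufficient set.
    return any(s <= conditions for s in SUFFICIENT_SETS)

def is_inus(factor, within):
    # 'factor' is inus relative to 'within' if it is insufficient on its own,
    # non-redundant (remove it and 'within' stops being sufficient),
    # part of a sufficient set ('within'), which is itself unnecessary
    # (some other set would also have done the job).
    insufficient_alone = not produces_starter(frozenset({factor}))
    sufficient_with = produces_starter(within)
    non_redundant = not produces_starter(within - {factor})
    unnecessary = any(s != within and produces_starter(s) for s in SUFFICIENT_SETS)
    return insufficient_alone and sufficient_with and non_redundant and unnecessary

recipe = SUFFICIENT_SETS[0]
for factor in sorted(recipe):
    print(factor, "is an inus condition here:", is_inus(factor, recipe))

Every ingredient of the first recipe comes back true, which is about as much credit as any one of them can claim.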

There are lots of ways to skin a cat. Which means there’s an interesting sociology of popular science lurking in the internet’s various treatments of wild yeast sourdough starter. There are many strategies on offer, each presenting a series of essential steps to success. And each of the strategies will in fact result in a successful culture, while adding procedures that may be important only to offset the sabotage added by other procedures, or to create an outcome distinguished only by the specific way it was achieved; or not important at all except for attention focus or ritual (which, by the way, are not trivial considerations). Apparently when a thing happens to work one way, we can be inclined to leap to the conclusion that this is the one best way to make it happen; ignoring all evidence to the contrary, for example all the other ways described in their own loving detail by other practitioners just as convinced of the robust essence of their accidental triumphs.

Incidentally, this is also how I think about education in general, and general education in particular.

March 11, 2013

We’re all moocs now

by Carl Dyke

I am excited to discover a startling technology that will change how we teach, learn, and even think! This technology efficiently stores the accumulated knowledge of our most expert minds. It is easy to access with skills a child can master; combines visual, auditory, tactile, and even olfactory stimuli to activate any learning style; can be enhanced with images, charts, graphs, and other media; and can be shared by one or many at times of their own choosing.

Yes, believe it or not this technology makes the entire treasury of human knowledge available to everyone at virtually no cost! Just a small fee to compensate the material and intellectual labor of its producers; or with sufficient public demand and institutional support, no cost to end-users at all. And because of its low cost and ease of access, this technology encourages new knowledge and new knowers at a historically unprecedented rate and intensity.

Perhaps best of all, this technology is many times more efficient than lecture for information transfer. It will therefore allow us to ‘flip’ our classrooms, liberating teachers and students from the drudgery of rote learning, moving content acquisition to home self-study, and freeing up class time for discussion and reflective integration.

This revolutionary technology is called ‘books’.

What’s my point? We’ve been in the technological new regime for over 500 years. Mass information storage and availability have not been the issue for a very long while, although the new digital media are tremendous conveniences. My point is that it’s downright bizarre we’re still treating lecture like a respectable teaching strategy and flailing about for trendy new alternatives to it. My point is that as long as we treat oral transmission as the teaching / learning default, we are culturally pre-literate. My point is that it’s long, long past time we could be doing much, much better. Click through to the links for more on how to notice and think our way out of this trap, thanks to the physicists.

My point, finally, is that the problem with moocs is not that they overthrow the great traditions of teaching and learning. The problem with moocs is that yet again, they don’t.

Where all the windmills at?

Well, any cultural system that so clearly works against its own manifest opportunities and interests for so long must be accomplishing something else(s) important. Any thoughts about what?