Tuesday
August 6, 2013

Public policy idea: Rename all of Chicago "Hyde Park," end all crime.

While I've been living in Hyde Park for only a few days, I have spent a lot of hours over the past several years reading maps of Chicago's Community Areas and neighborhoods. As friends scattered from Evanston and around Chicago years ago, I got to learn more about Chicago's neighborhoods, both the real ones and the ones invented by realtors (see, e.g., Bucktown and its environs, and most anything with "west" in its name). But something I've only recently noticed is how so much of the South Side is now apparently "Hyde Park." This week I've heard places as far north as 31st St. referred to as Hyde Park; places as far south as the 80s (!) and as far west as the highway are now also apparently part of the magical, ever-expanding Hyde Park neighborhood. Most interestingly, all of these references were approving ones, praising the virtues of so-called Hyde Park, even when my interlocutors were referring to Bronzeville or Woodlawn or Kenwood or whatnot.

What the heck is going on here? A friend suggested that, to get white people to visit, you have to refer to anything north of 90th St., east of the highway, and south of Roosevelt Rd. as "Hyde Park." If even our President lived in Hyde Park, it can't be a bad place to be, right?! (Never mind that the Obamas' house is actually on the southern edge of Kenwood and not in Hyde Park.) The power of the Hyde Park name isn't surprising, given the Obama and UChicago connections. And Hyde Park's reputation as a bubble on the South Side is deserved. But that the name alone, when applied to sundry South Side street corners, can so effectively transport literal South Side locations out of the lore of the "South Side" and into places (white) people would want to visit has been, I'll admit, a strange thing to witness.

Friday
July 5, 2013

First-generation college students don't graduate.

About 90 percent of first-generation college students who enroll in university don't graduate within six years. These students are the ones who've already shown the tenacity and intellect necessary to apply to, get accepted into, and enroll in a four-year university, even with the extra hurdles of being first generation and often under-resourced. They are not dumber or lazier than their middle-class brethren, who graduate from four-year programs at a rate closer to 60 percent nationally. Money, of course, is an important cause of the difference in graduation rates. But social capital and a network that understands the value of college are also insanely important.

Before I started college I'd already attended two academic summer camps at the same university and lived in a college dorm for a month each time. My family understood the value of college and actively encouraged me to apply to the best schools, no matter how many miles away they were or how high their sticker prices were. When I started sifting through brochures with lists of majors, imagining my future as a linguist or statistician or accountant or whatever crazy ambition I had, my family and friends were able to engage me in substantive conversations about academic study. Of course many first-generation college students have families who understand the value of university and actively encourage their students to aim as high as they can. But others may not see the added value of a four-year university three states away over the community college next door.

All of which is why I'm insanely excited about starting my role as a volunteer mentor to an under-resourced student from Chicago who'll be attending college (an East Coast Ivy League school) this autumn. There is an insane amount of low-hanging potential that America is ignoring. Students who have already managed to achieve so much and enroll in university are already most of the way to becoming college graduates, higher earners, role models to their communities, and model citizens in the greater United States. As deeply as these students may put their noses in their books, there are still unwritten rules of college engagement they won't know and support they can't get from traditional sources. Mentoring can give students the emotional support they need to know that college is a worthwhile investment. At the price of free labor from volunteer mentors and a small paid organizing staff, it'd be crazy not to give it a try.

Monday
June 17, 2013

Adjuncts and the tenure track in sixteenth-century Europe.

When academia chatters about itself today, it mostly complains about the death of the tenure-track position and universities' increased reliance on contract adjuncts with high teaching loads and no time for research. The subtext, if one believes all this eulogizing of academia, is that there once was a golden age when professors were all well-paid, when scholarship was highly respected, when research dollars were plentiful and innovation unencumbered.

It's too easy to point out the silliness in the parochial celebration of academia's alleged prior glory: which is worse, not allowing women or minority professors, or paying professors too little? It's not a useful question. 

So I was amused to read this passage yesterday about scholarship in the time of Martin Luther, Erasmus, and John Calvin. Apparently even in the sixteenth century scholars were complaining about having to teach too much, getting paid too little, and earning no respect:

The spotlight of historical study has been focused on the greatest of the new scholars, those who have been the first beneficiaries of this study and who attracted the most attention from princes. Remaining in the shadow are all those who made up the rank-and-file of the movements, the minor scholars whose work brought them only a poor living and who might bask for many years in their pride at having approached Erasmus during one of his journeys, or met Colet or Fisher at Oxford, or received a letter from Bude or Gaguin. Those who kept up the continuous activity of these associations without financial endowments or paying pupils experienced anxiety and anguish that were all the harder to bear because books were expensive and they could not carry on their work without frequently purchasing them. If, in order to live, they became printing workers, for years at a time, they lost contact with the circles of new learning. More commonly, they preferred to teach, outside the traditional framework, as needy tutors instructing little groups of children in Greek and Latin: they were poorly paid, and persecuted by Church authorities when their illicit activities became known. It was for them that Philip Melanchthon wrote his De miseriis paedagogorum, in which he described the woes of these penniless scholars vegetating in petty teaching posts.

[The book is From Humanism to Science 1480–1700, by Robert Mandrou.]

Tuesday
March 12, 2013

Quality of a life.

Erin Callan, erstwhile CFO of the now-defunct Lehman Brothers investment bank, contributed a salvo to the "work-life balance" debate a few days ago in a New York Times op-ed. Apparently she worked on Sundays and ate her meals at her desk and ignored her husband and friends — her work was her only priority, as she tells it. And now she wishes she hadn't done all that work. Or something. A thousand-word op-ed is by nature limited, but I'm not really sure what her point is, or what she thinks she's contributing to the conversation about work-life balance. She's 47, once divorced, and now undergoing IVF to try to have a child with her second husband. She wishes this weren't the case. She writes:

Sometimes young women tell me they admire what I’ve done. As they see it, I worked hard for 20 years and can now spend the next 20 focused on other things. But that is not balance. I do not wish that for anyone. Even at the best times in my career, I was never deluded into thinking I had achieved any sort of rational allocation between my life at work and my life outside.

Callan may regret her decisions now; she may even regret them because it's now apparently fashionable for women to lament or to some extent disown the younger selves who sacrificed for their careers to the exclusion of everything else. But I think it's disingenuous of Callan to say that her decision to work so many hours wasn't "a rational allocation between my life at work and my life outside". It wasn't balance, sure, but that doesn't mean she didn't find it rational, or that other women or men who are not Erin Callan would be wrong in making the choices she did. She's an intelligent and ambitious woman; she could've found a less demanding job. It would've come with lower pay and less prestige, of course. But no one is entitled to unlimited prestige, millions of dollars in salary and bonuses, plus whatever "full" outside life he or she desires.

What bothers me about these "can't-have-it-all" essays is that they seem to come predominantly from highly educated, upper-middle- or upper-class, privileged, straight white women. These are not the women I'm worried about. I don't care that Erin Callan had to check her BlackBerry on Sunday or that she flew overnight to Europe for a meeting (oh, but on her birthday! How devastating). She got compensated handsomely for her time.

The women and men I'm worried about are the ones working eighty-hour weeks because one job alone wouldn't pay them a living wage. I'm worried about the women and men who've been priced out of their last three neighborhoods to make room for luxury condos and who now must commute an hour or two each way into the city for work, at the cost of a car, gas, insurance, and time. These are the women and men who, although they've sacrificed everything else in their lives for their jobs, don't get rewarded with prestige, a several-million-dollar salary, and an op-ed in the New York Times.

To Callan's credit, the op-ed doesn't solicit our pity (very much) for how hard her life must have been — she made her choices; it's just that now she regrets them or sees them as having been unnecessarily extreme. Fine. Callan has the right to her opinion and to express her regrets at her leisure. I'm not sure the New York Times op-ed page is the right venue for such a thing, but that's more a reflection of what people want to read (or what the NYT op-ed page editors think people want to read). I'd like to hear from the women who didn't get a choice, though.

Monday
March 11, 2013

Beckett and the Project for Existence.

It is an odd project to undertake, to wonder whether one exists. Once that’s been established – if it ever can be – it is even more disconcerting to explore the terrain just beyond where one ends and where others, if they exist at all, begin. Ontology remains an exercise for the humanities, attempted often by philosophers and less frequently by biologists[1]. No extant technology can demarcate a bright line between brain and mind, leaving the question of existence (ironically or appropriately enough, depending on your sense of humor) simply a cerebral exercise[2].

For René Descartes, thinking was sufficient: If humans are sentient beings, then coming to that realization through cognition proves existence. Exploring the terrain beyond the constricting “I” and “I think”, the void between I and the Other, Descartes left for others. Stephen Dedalus, traipsing along Sandymount Strand in James Joyce's noonday sun, took two steps past Descartes, in the direction of disproving solipsism. First, Dedalus saw things — the nearing tide, a rusty boot — and, unable to escape vision, figured, well, the things he saw must exist (at least in the mind). But how to know whether they exist outside the mind? Dedalus recalled that Aristotle, by knocking a sconce (candleholder) against a body, knew it existed. And there's the second step in escaping solipsism: If you kick a rock, you'll stub your toe, and so the “I” must have a border, a container that’s damaged when it collides too forcefully into an Other.

Joyce’s contemporary Samuel Beckett demurred, rejecting Descartes, Dedalus, and Aristotle. Beckett wouldn't take the “I” for granted and, in Texts for Nothing, depicted a mind (maybe) kicking and thrashing against its own thoughts, occasionally coherent, dubiously aware, but existing (again, maybe). For the uninitiated, it’s hard to give a satisfying summary of Nothing, whose effect depends on its having no narrative to summarize (those wary of the postmodern project, look elsewhere).

Nothing is 13 untitled fragments; asking whether the narrator is the same throughout the whole would be missing the point. From text #8: 

“Well I’m going to tell myself something (if I’m able), pregnant I hope with promise for the future, namely that I begin to have no very clear recollection of how things were before (I was!), and by before I mean elsewhere, time has turned into space and there will be no more time, till I get out of here.”

From #8 again:

“But that other who is me, blind and deaf and mute, because of whom I’m here, in this black silence, helpless to move or accept this voice as mine, it’s as him I must disguise myself till I die, for him in the meantime do my best not to live, in this pseudo-sepulture claiming to be his.”

The fragments of Nothing eschew enough rules of English syntax to leave the reader grasping for a foothold, narrative or otherwise. Texts for Nothing is only a work of fiction in English because it is packaged as such: the words are English; the punctuation is English; the spacing and page layout are in keeping with English prose conventions; its publisher calls it prose. But what is sensible, what the eyes can see on the page, is illusory: the text gives the illusion of being a short story in English, but it lacks the internal requirements of characters, actions, temporality.

Individual sentences, if their “I” and “you” and “him” refer to corporeal subjects, might be comprehensible. But there’s no evidence, within the limited confines of the text, to support a belief that I or you or any voice I’m hearing exist. Does implied temporality — here, a monologue from a narrator who appears to exist in time — imply materiality? These thoughts we’re reading are written down, take up physical space and ink and paper, and survive the writer and any reader. Is this enough to prove the mind’s existence?

When Descartes wrote or spoke “I think, therefore I am,” vocalizing his existence through words, he still had to rely on the physical, transferable (from mind to paper or voice) nature of words to make his point. Beckett attempts to annihilate this dependency by forcefully containing words to the mind, disintegrating the physical body. But he can’t annihilate language. What we’re left with is text, and text can’t speak; the “I” of Nothing is only a cruel irony, not a presence. Nothing is not an answer to the mind/body dilemma; it is a clever nothing that apes existence, for the purpose of nothing. It is Beckett at his best.

I first read Nothing many years ago, but was moved to revisit it when a friend was recently diagnosed with depression. The diagnosis came as a surprise to him. He wakes daily to his body, exists at any moment only with the help of his brain. That his depression — rooted in his brain, if the materialists are to be believed — eluded his notice for so long is a cruelty of the mind/body regime. Beckett’s Nothing is a mind that is yet somehow something less than a mind (because a text unburdened by a consciousness cannot “speak” in the way a person might). But depression is an artifact of a mind that is somehow more than a mind: as best as I can understand it as a non-clinician and non-sufferer[3], it appears to the victim that he has one mind he can control and another, more sinister, mind that he cannot. When my friend told me of his diagnosis, I thought of Beckett, of Nothing, because, as a representation of mind, it is the perfect fraud. And I worried for my friend, who now worries that the mind he’d come to depend on to build a life was an ersatz mind, not worth its weight in brain matter.

 

[1] The classical provenance of the mind/body problem was, of course, scientific, though in an age when philosophy and science were one discipline. Today, most scientists are materialists, holding that everything in the mind has its analogue in the materials of the brain, even if we don’t yet understand fully how the brain works.

[2] Even Daniel Dennett, one of the best-known contemporary theorists of mind, holds his doctorate in philosophy, not neuroscience. I must, however, at this point plead ignorance of much of neuroscience.

[3] And where, to be honest, most of my knowledge comes from William Styron’s Darkness Visible, because of course.