Fordham GSAS: Grad. Life: February 2011

Monday, February 28, 2011

E-Books, Part II

There’s been an article going around on Facebook lately. Mostly, I’ve seen it being reposted by other English grad. students. But I think it really applies to all of us in one way or another. The article is about marginalia, and about how the digitization of books and the move to electronic texts means losing this precious gem.

Scholars, and even grad. students such as myself, have spent a lot of time discussing the pros and cons of e-books. The cost. The emotional attachment. The issues with varying editions. Even the inability to take notes on e-readers. Now academics, both within the university system and outside it, are realizing how much more we’re losing, and trying to find a solution.

A recent Chronicle article discussed the problems not only students but also scholars are having using e-readers. If a scholar is trying to publish an article and is citing a text she has on a Kindle, for example, then…how does she cite it? On the Kindle, there are no page numbers. Instead, Amazon assigns specific passages “location numbers.” On e-readers like Sony’s, where you can change font sizes, things get even more complicated. There have been suggestions that this issue could be solved by numbering each paragraph of a text, for example. But that hardly seems like a perfect, or even desirable, solution.
The other big problem goes back to one I discussed in my previous post: the inability to annotate an e-book. If we can’t annotate e-books, we’ll be losing not only our own ability to take notes and connect with the text in that (very deep and, for some people, very necessary) way, but also the amazing legacies that authors and other famous figures often leave us through their marginalia. If Mark Twain had only read e-books, we never would have known what he thought of all the books he read! But, because he was able to write in his simple, primitive, antiquated paper books (which are totally wireless!), we have been able to learn a lot about him and his time period through his own voice.

Is this kind of information really worth losing? This isn’t just about emotional attachment anymore. This is the potential loss of real facts and historical information we’re talking about here. And, even if e-readers that support writing are developed, there seems to be much less of a guarantee that note-taking on an e-book will be as prevalent as in a regular book…or, at least, that those notes will be kept as often. Even if people do start creating digital marginalia, what are the chances it will survive after their death? It’s so much easier to erase type than it is even a pencil mark in a book. And so much easier to drag something into the digital trashcan than it is to throw something you’ve taken to bed with you into the fire and watch it burn.

Monday, February 21, 2011


Image: Alexandra Loizzo’s zombatar. Create your own here

As you all probably know, zombies are in right now. And I mean really in. Beyond Michael Jackson’s Thriller kind of in.

Even if you haven’t watched The Living Dead or read World War Z (which I haven’t), I'm pretty sure you’ve at least heard of Jane Austen’s “newest” novel, the wildly popular Pride and Prejudice and Zombies. Personally, I know all the words to the Plants vs. Zombies theme song. But that’s about as far as I’ve gotten into this new trend. Well, that and taking the quiz on whether or not I will make it through the Zombie Apocalypse, scheduled for 12/22/2012 (FYI: I will survive!).

In discussing this new fascination with everything undead, I’ve heard several people say that these supernatural fads, whether they be for zombies, vampires, wizards, or anything else, come in cycles. Or, if you prefer a different metaphor, in waves. The interesting thing about this new zombie obsession, though, is that the mythology around the creatures seems to have changed.

What Wikipedia terms the “original zombies” were connected to voodoo. This version of zombie lore was popular at least through the 1940s, and you can see a representation of it in the 1943 film I Walked With a Zombie (postcolonial theorists shouldn’t have any trouble constructing a fruitful analysis even from just the trailer of this film).

Of course, being an old woman inside, I was only familiar with this older myth. At first, I thought the Plants vs. Zombies zombies were weird in their fascination with brains (AKA “braaaaaaaaaains”) and eating flesh. But, as I found out this past Thanksgiving, my version of the myth is apparently out of date. This whole brain-eating thing is just part of the new construction of zombies. Generally, zombies are no longer people under a powerful voodoo mind-control spell. They are, as Wikipedia so aptly describes it, “victims of a fictional pandemic illness causing the dead to reanimate or the living to behave this way [as zombies].”

This new incarnation of zombies, perhaps surprisingly, has also lent itself to academic analysis. Now that zombies are seen as a potential health threat and not just the creations of some strange form of magic (most similar to the Imperius Curse in Harry Potter), academics can analyze what our fears of zombies really mean. We can think about what a response to a zombie apocalypse would look like and compare it to how we should respond to other cataclysmic events. We can analyze ourselves, our societies, and our politics through this pop-cultural fiction.

In his recent book Theories of International Politics and Zombies, Daniel W. Drezner (also author of the Chronicle article that partly inspired this post) says that he “realized that zombies are a great synecdoche for a constellation of emerging threats […] I looked at the literature on ‘zombielike events,’ calamities akin to an army of reanimated, ravenous corpses. This meant researching the sociology of panic, the political economy of natural disasters, and the ways in which past epidemics have affected world politics. […] Applying international-relations theory to a zombie-infested world was a way of affectionately but satirically tweaking the field's strictures.”

I don’t know if this method would work for every field or for every scholar, but I think Drezner’s decision to take something that is currently so ingrained in our culture and use it as a lens through which to analyze his subject’s concerns and current or pending real-world problems is brilliant. These kinds of books may be somewhat satirical, but satire is also engaging (I mean, hey, if it was good enough for Jonathan Swift it should be good enough for us, right?). And using such a broadly known topic as a starting point allows academics to draw more people into the conversation. Maybe now it won’t just be the author’s colleagues who read the book; everyday people (AKA zombie enthusiasts) might pick up the book for different reasons and learn just as much.

I’m sure there are other books like this out there; perhaps we just don’t hear of them as often because the approach isn’t considered as normal. Or accepted. Or even encouraged. Personally, I’d love to see more people do this. And not just with zombies. Though I do think a book analyzing the shift in zombie lore would be fascinating.

Monday, February 14, 2011

Valentine's Day...the Modern Way


“The digital classroom.” Most people have probably heard this phrase before, but what does it even mean? I guess a lot of people think of those commercials for The University of Phoenix Online. But can’t programs like Blackboard, which are meant to make learning in the physical classroom easier, become “digital classrooms” of their own?
That’s certainly what Sharon Marshall seems to think. What I appreciated about Marshall’s Chronicle article, though, is that, while she expresses her concerns about the overuse of programs like Blackboard, she doesn’t go totally anti-digital either. She’s just trying to find a balance between using technology to make things easier for students and having really engaging face-to-face discussions in the classroom.
I’d be interested to know whether the best balance changes depending on the discipline, but personally I like it best when professors use Blackboard as a reference site. I’m not too fond of public response papers; although I’ve enjoyed them in the past, I think they put a lot of unnecessary pressure (time pressure, peer pressure, ideas pressure…so many kinds of pressure!) on people. Especially people who aren’t very confident writers. I like it best when Blackboard is a place where I can go to find the syllabus, the readings…all those paper things you might lose or would otherwise have to go searching for yourself (as if it doesn’t already take enough time just printing everything you have to read each week!).
So, despite the fact that I’m a blogger, I’m obviously not as into technology as some other people (as you guys have probably noticed from my previous post on digital books and e-readers). Which is why this new invention that I stumbled upon has got me a little freaked out. Maybe it’s because I recently became obsessed with the reimagined Battlestar Galactica series and I’m afraid this is the first step on the road to Cylons…but doesn’t the fact that you can now rent a robot to go to your meetings for you freak anyone else out? I mean, just a few months ago this was a joke! It was something crazy Sheldon made up on The Big Bang Theory, not a real thing. If we can go to meetings using a robot, why can’t we go to class that way? Or why can’t you take your long-distance boyfriend or girlfriend out on a Valentine’s Day date via robo-cam? And have you noticed how cute the robot’s eyes are? We’re going to end up like the people in Wall-E if we’re not careful.
Anyway, if we’re trying to strike a balance in the “digital classroom,” maybe we should be doing the same in the “digital workplace.” Why is video chat suddenly not good enough for attending meetings? Is a little plastic head on a stick really that much better? I think it’s just a little creepier.
But maybe it’s just me. Maybe I’m secretly from another generation (sometimes I do feel like I’m a 50-year-old woman on the inside). Maybe everyone’s personal technological balance is different. I’m just not sure I’d want to be in class…or on a date…or in a meeting…with a robot. Somehow, I’d even rather bring my laptop and see your face on that screen.
Well, Happy Valentine’s Day, everyone! Let's all try to enjoy one of the holidays that doesn’t fall during December madness.

Monday, February 7, 2011

The Readers

And no, I’m not referring to the movie (or even to Hugh Jackman’s fantastic non-interpretation of the movie). I’m referring to graduate students.
I’m pretty sure that all graduate students, no matter what their discipline, could be considered professional readers. It doesn’t matter if the other part of your grad. school life is taken up by labs, by problem sets, by research, by writing, by traveling, or by anything else. Most of us are probably reading more than we ever thought it was possible to read in a week. I think I calculated that I was reading about 1,000 pages a week last semester…and that was just in assigned texts. What about all those Chronicle articles?
Speaking of Chronicle articles (see how I got that in there, guys?), this one I stumbled upon last week is about exactly that: graduate students and reading. Ever since I started undergrad., I’ve heard a lot of advice about how to read effectively as a student. The advice I’ve gotten most often has been some version of this: “It’s impossible to read everything…you have to skim. You have to learn how to cut corners.” This is the same advice that is given in part of the Chronicle article: “…learning to read properly becomes one of the most important hurdles faced by students in postgraduate education—as does, ironically, learning to not read, or to choose to read certain texts incompletely or not at all.” I know this is probably true but, for someone like me, it’s also terrifying. “What do you mean, ‘don’t read everything’!?” That’s my panicky internal reaction. Of course, I’ve had to skim a few articles here and there like anyone else. But the idea of making that a consistent method of getting through my work, or of consciously deciding to do this, still scares me. Yes, even now as a 2nd year MA student.
And I know I’m on the wrong side here. I’ve known it since undergrad., when friends worried that my old “read everything” habits would die hard once I got to grad. school. I’ve noticed that people who manage to skim at least some things get much more information than I do. They get to read (on some level, at least) extra or suggested articles, and not just the required texts. They often get through research faster. They’re less stressed out. But I can’t get over this psychological hurdle. Luckily, thanks to the Chronicle article and some people I know personally, I know I’m not alone. But it’s definitely still a struggle to convince myself that it’s going to be okay if, for example, I don’t read every single text on the comps. list.
The more unexpected part of the Chronicle article is actually the discussion of teaching students how to read. The authors make a good point when they say that “After the elementary years, schools pay little attention to the mechanisms of reading. We read as if all texts, even the most complex, were Dick and Jane.” And, of course, the texts we’re reading are not like these first-grade readers at all (despite the authors’ amusing comparison of Foucault and Goodnight Moon). The expectations are also quite different. Until high school or perhaps even beyond, “reading is regurgitation. In graduate school, reading and the ability to discuss and interpret that reading are simultaneously a means by which a student asserts an academic identity and the basis on which a student can produce new knowledge.” If students don’t learn how to read effectively, and by that I mean both analytically and in a time-efficient manner, this new mode of reading is never fully experienced.

Essentially, it seems the article’s authors are encouraging us to accept that we can’t read everything and to focus instead on reading on a deeper level. We have to become real grad. students, not ideal ones. “Skills are always approximate, but an identity is forever,” they say. But if I want to try to read everything on that deeper level, does that make me a bad grad. student? Does it mean my identity is simply that of an idealist?