© Tim Blackmore and Figure/Ground
Dr. Blackmore was interviewed by Laureano Ralón. June 4th, 2013.
Tim Blackmore is a Professor in the Faculty of Information and Media Studies at the University of Western Ontario. Dr. Blackmore received his MA and PhD in English from York University. He is the author of War X: Human Extensions in Battlespace and has contributed to the Encyclopedia of American War Literature. He has written numerous articles on war, culture, comics, science fiction, fantasy, and machine-human-cultural interaction.
What attracted you to academia? Was it a conscious choice to become a scholar?
The last thing in the world that I expected (or, in my teens, hoped) to become was a university professor. I was a terror-stricken teen living in the claustral world of an agoraphobic with severe anxiety disorder and a good dose of obsessive-compulsive behavior thrown in. So how did I get here?
I went to S.E.E.D., a public “alternative” high school in Toronto. If not for S.E.E.D. and the people there, I certainly wouldn’t have passed high school, let alone survived the process. Their method, founded in Neill’s Summerhill, was that the students organized the classes they wanted: teachers were there for guidance, not the production of knowledge. I loved to read. I wanted to read novels, and after a while I wanted to know how novels had started. I approached some (what I learned later were) graduate students at York, and some junior professors at the University of Toronto. In astonishing but also typical acts of generosity and kindness, just about everyone I approached agreed to offer courses as “catalysts,” people who would mobilize discussion for a three-hour session each week around a text and topic decided on by the class. I was surprised then, and am astounded now, at just how good people were to us year after year. They talked with us, never down to us. I’ll name some of them, if only to finally register properly my gratitude for the enormous work they put in: Maureen Bostock, Nancy White, Terry Ferguson, Julian Patrick, John Meagher. If I’ve forgotten someone, please add your name, and don’t feel hurt because of my bad memory. Some of these people, particularly Terry Ferguson and Julian Patrick, had a profound influence on my ability to close read texts. At fourteen I was lucky to study with two of Canada’s shining stars in Shakespearean studies, Julian and John (we read one Shakespeare play a week until we were done), and cover a huge range of contemporary and classic novels with Terry, a Melville specialist (because of Terry we read Mishima, Kawabata, Grass, but also then little-known African writers like Amadi and Ekwensi).
By the time I was fifteen I was working a part-time job, going to see two or three double-features at second-run art house cinemas a week, and reading reading reading. I was and am a very slow reader, so I wanted to put in all the time I possibly could with books. I read all of Anthony Burgess, Steinbeck, Kerouac…you probably have the picture. I always had a love for American literature and culture, particularly a fascination with the South and the Civil War, why I was never sure. Decades later I returned to this obsession, which taught me that passions, even if they go underground temporarily, rarely die. While I was reading myself to life, I was also catching up on years of the private stuff adults talked about. Books, art and film showed me the secret world of adults that seemed inaccessible. I read everything I could, got a head full of film and art, and tried to sort it all out. I was cheap and indiscriminate. In other words, I was a normal, scared teen floating in the world, equally amazed and horrified by what I saw. Remember this was before desktop personal computers, before cable (for most of us), and certainly before VHS tape became commercially available. There was a thriving and wonderful second-run art house cinema world in Toronto, so each week I’d see double-bills at the Cinema Lumiere, or the Revue, or if I could hack the weed smoke and associated headache, the Roxy (the bonus there was that it was 99 cents—my hourly wage was $2.40—so this was a particular bargain). A good week meant I might see six films, maybe more. It took me about two years to catch up with the French, Italian, German and American new wave. There’s a wonderful moment in Truffaut’s Day for Night where Jean-Pierre Léaud, asked whether he’s going on a vacation with the crew, answers in shock that with so many theatres in the city, he couldn’t possibly go away. That’s how I felt. Film pushed me to art, both pushed me to reading. 
The night after I saw John Schlesinger’s eerie Day of the Locust I ran out and read all of Nathanael West’s four short novels. This was also how I came to North American and European science fiction. A friend’s recommendation to see Tarkovsky’s Solaris made me want to know more, and so I began reading Stanislaw Lem. I didn’t separate film, fiction, visual art, music, but loved them all. One good friend took a wild shot and gave me a copy of Jerry Robinson’s The Comics, a history of American comic strips. I had always loved comic art, particularly Peanuts and Pogo, and this book connected with my growing adoration of animation as an art form. Things began to seem interrelated, even though my self-education was as chaotic and unplanned as adolescence itself. In the early- to mid-1970s comic art was still trash as far as the world was concerned, or iconic trash as reformed by Warhol and Lichtenstein. I sank into comics, animation and science fiction and hid there in the weeds with joy. Somewhere along the way I began writing science fiction, submitting stories to the few remaining magazines that paid for fiction. Nothing was ever published, but I felt like I was beginning to find a way forward.
After learning to handset type and run a small printing press, I finally left high school and wandered aimlessly for a number of months. I apprenticed with some friends who ran a graphic studio, and fell on the work with joy and abandon. I wasn’t as quick as I could have been, but I loved the process of doing rough work and seeing a design, even for something small, develop from a series of sketches through to printed work. Holding printed work made sense to me, made permanence in the world, and brought me unalloyed satisfaction. When I lost that job, I had to go back to the beginning. I was stuffed back into the deep agoraphobia that had clouded my teens.
It was only at this desperate juncture that I began to consider school. Now 20 and with no real formal education, I thought about going to university. I was so terrified of going to a real school, felt so ill-equipped to sit in a classroom, that I stalled. I did what so many anxiety-ridden people do and have done—I retreated. I decided to return to high school, complete an official high school diploma, and then see. So it was that at age 20 I found myself back at S.E.E.D., but this time to work as an adult does. It was finally time to satisfy my abiding curiosity about American history and culture. For $75 a member of the public could then apply for a Research Reader’s card granting borrowing privileges in the University of Toronto’s extensive library system. I spent each day at the Robarts Library, still one of the finest libraries I’ve ever worked in. Each week I would scan the Library of Congress “E” listing (American History), and I began with books about America when it was a small amalgamation of joint stock companies. Over the course of the year I read my way from E1 through E800 (around Reagan), then on through history and American politics (“F” in the Library of Congress), trying to figure out what it was that made America so intriguing to me. Around March I began to feel a terrible itch to produce some kind of document: just reading made it hard for me to lock in what I was thinking. I had spent more than two years sitting at a drafting table seeing work come in as rough directions and go out as printed material, sometimes within a matter of hours. Here I was sitting and reading and thinking, but doing nothing for my hands. I began to write short papers each week, learning to say what I needed to about various texts I loved or was fascinated by, but allowing myself only 1000 words in which to do it. I wrote little studies of texts as diverse as Huckleberry Finn, Dispatches, and Fear and Loathing on the Campaign Trail.
The more I wrote, the more the internal pressure I’d been feeling eased. I realized that I needed to see some kind of end result in any work. I learned that there was a continuing relation between thought and response, idea germ and completed object. To have ideas alone was not enough—they had to assume a physical form.
Harriet Wolff, an exceptional teacher at S.E.E.D., had been the saving of so many of us—me included. Later she nurtured a younger fellow, another guy who wanted to write science fiction—someone eventually known around Canada and then more widely as Cory Doctorow. Harriet coaxed me to take a writing class at a local community college. I went with her each week, and slowly gained the confidence to write more sharply, but most of all, to be in a room with more than 20 people at a time. At each point along the road, someone helped me, and Harriet is one of those people we all need to have if we’re not going to disappear along the way. By August I figured I could probably do one, maybe two, courses at university. I applied to Atkinson College, York University’s night school. I couldn’t tolerate the idea of sitting in classes with 18 year olds who would be sharper, faster, and a great deal more sure of themselves than I could ever be. I took lessons from genre fiction and determined to hide. Atkinson turned out to be perfect. I spent over seven years finishing my undergraduate degree—my world widened and deepened wonderfully—and finally by the age of 29 I was able to move about in the world, and consider that teaching and writing at the university seemed like something into which I could put my heart. So I applied, panics included, to graduate school, the next terrifying, enlightening step. I knew by this time that I would always have fear, but it no longer stopped me. This has continued all my life.
In the early 1980s I realized that I missed the hands-on work to which I was so accustomed in graphic design, overcame dreadful shaking fear, and enrolled in part time classes at the Ontario College of Art (OCA, later OCAD). Although I had drawn all my life, I had never studied art formally. From the summer of 1983 until I left Toronto in 1995, I took at least one or two courses every year. It was mostly wonderful. A war veteran once signed a book to me: “Love art. Love life.” He was right.
Along the way, while people were supportive, very few understood why I was going so slowly, how risky I felt each step was. By the age of 29 I had learned that the way I did things was not necessarily “normal,” but by then I didn’t care. I had to live, and this was the best way to do it.
Let’s talk about your mentors in university. What were some of the most important lessons you learned from them?
There was a key moment, something almost too fictional to be real, about my decision to stay at university. When I went to York, the school was committed to a general education program that came from the University of Chicago’s Core Education design. All students, no matter the discipline, were required to take common courses in the Natural Sciences, the Humanities, Social Sciences, and Logic or Math. The idea made sense to those of us living in C. P. Snow’s Two Cultures who understood very little about other epistemologies. I was aimed at the Humanities anyway, and so took, with great delight, a survey course on what was bluntly referred to as “Am. Civ.” (American Civilization) at the time (it would now have five more words and the requisite “culture” in the title). This was taught by a professor who would become my mentor, guardian, friend, and who would, more than anything, teach me to think, write, and be a scholar. He was a Canadian who spent a crucial part of his own education at Duke University smack in the middle of the Vietnam war. Edward “Ned” Hagerman was my introduction to the way the world really worked. His course launched the discussion of the European Enlightenment, of which I had been an ignorant inheritor and, as a white guy, an enormous beneficiary. For all my reading, I didn’t have any idea about how things had come to be, why the world was now in the shape it was. The content of this “Roots of America” course gave me a delirious sense that I finally knew some of what people were talking about. The puzzle of the difference between Catholics and Protestants was much clearer. I began to understand why, for me, work was more important than life, or why, in the parlance of the 1990s, I lived to work rather than worked to live.
I had been raised in a late Calvinist culture that prized articulate speech, reading, learning, hard work, excellence, and from men, limited displays of emotion or affection, a hard crust of toughness, no matter how small or frail one was. This now made sense to a little weak kid with glasses who loved to read and draw. The course made the university into the best place in the world—I wanted more of this, and I wanted it to keep coming. I now officially wanted to own it all. I would walk past the open library stacks with a combination of glee, thrill, and sadness. I was awed by how much there was to learn, and always had the sinking feeling that I would know only a tiny slice of it. But that was for later. For the first time, ideas were in motion, and Ned Hagerman’s droll, wonderfully thoughtful style and sense of humor kept them there.
Ready to go on to the next two winter courses, I was in the night school looking at course postings. With increasing dismay I looked at course after course, unable to tell what might be good, even acceptable. As I looked, the confidence I had newly built up over the year began to erode. I panicked. I sagged on a couch, panting anxiously. I had no idea what to do next. Finally, I got enough energy to get up and shuffle to the door, ready to leave and not return. As I opened the door, Ned walked through on the way to his office.
As I say—it seems fictional.
Over the course of years I have often replayed that moment, wondering what would have happened if I had been a few seconds quicker getting out the door. It’s one of the few times in my life that panic was a positive thing: it froze me on that couch.
Ned recognized me and asked how things were going: I stuttered along, explaining my problem. Without hesitation Ned began to look through the postings with me, and soon came across two courses, but more importantly, two teachers, I would delight in working with. As I, all shaky gratitude, took his advice, I learned four things: that it is best to take courses with professors, not subjects—that is, follow teachers, not topics; that multidisciplinary courses and their instructors generally privilege modes of thinking rather than rote content, and so were unusual and valuable; that people who have constructed courses like these have also put together their own idiosyncratic syntheses of the world, which can be a model for the student looking for epistemological frameworks; and to always ask for help, especially if one is a tough perfectionist. Along with Ned Hagerman, Professors Barrie Wilson and John Unrau were to be the most important teachers of my adult life, and shaped my ability to reason, write, and take joy from it all. I would never have guessed that a military historian of the Civil and Vietnam wars (Ned Hagerman), a hermeneutic scholar and all-round genius (Barrie Wilson), and a deeply modest, brilliant thinker and poet who prized analysis and elegant writing above everything (John Unrau), would be the people who gave me the tools I needed. I also know that things rarely go the way one expects. As I hoed the graduate school row, I took further and equally significant lessons from Professors Robert (Bob) Adolph, Susan Warwick and Sheila Embleton. These people taught me how to be an academic, and most of all how to write from my own mind and in my own voice. They have left an indelible, invaluable imprint on who I am. Daily I take their advice when I face problems or situations, and remember what they taught me.
Luckily for me, Ned came through the door. Luckily, I was having a panic attack. It all worked out beautifully.
In your experience, how did the role of university professor evolve since you were an undergraduate student?
Despite the warnings about the loss of the academy to the marketplace (I don’t dispute that this is happening), I think the job of professor hasn’t really changed all that much. I’m talking about professors in Arts and Humanities disciplines, where close reading, analysis, and writing are still the main tools of the trade. I know that people will be quick to say that the web has changed things inordinately. I work in an information studies faculty, and we’ve been using the cyborg as our operating metaphor for the last 16 years, so I don’t need to be told about how the students’ (or professors’) embodiment has shifted drastically. I know it has.
I can say that from the time I entered undergraduate school, there has been a systemic and systematic deskilling and elimination of workers at the university. The result is that I do a great deal more mechanical work preparing grades, posting information, and producing audio-visual material, where once there was support for these things. I use a great deal of A/V, and I’ve learned over the years that it takes me, on average, an hour to develop one minute of finished A/V. That includes the preparation of images in some photo editor, annotations, slides, clips of video edited for concision, the gamut. I use a minimum of 10-12 minutes per lecture and two lectures a week, which means some 240-400 hours of preparation of A/V for a new course (based on a 13-week term). That doesn’t replace the lecture; it enhances and clarifies it. It may, if nothing else, give the student a new way to look at the material. I don’t use a slide viewer like PowerPoint with lists of material I’m talking about—that has to be the worst possible way of using slides.
I know that there’s been a great deal of anxiety in the professoriate about needing to be a better performer in order to keep the attention of students now hooked up to a steady intravenous line of media streaming from their smart phones and laptops. I don’t think people have to become comedians or magicians to keep students’ attention. When I went to night school I sat through, as have all students everywhere, some dreadfully dry, bad lecturing. In one case I would take 14 pages of single-spaced notes over a three-hour period, and the fifteen-minute break we had in there was an oasis of sanity in the midst of the grinding wheels of material. At the same time, that course (a Natural Science course) was one of the most useful, helpful courses I’ve taken. I say this having taken the requisite 40-odd full (six-month-long) courses one does to complete the degrees leading up to and including a doctorate (plus the other 18 or so courses I took at OCAD, all of which were as demanding). I know that our students are different people, and that they have very different expectations and demands. But then, the professors who taught me were teaching in the wake of the 1960s’ commitment to social justice and activism. Students of the increasingly anti-intellectual 1980s must have been a terribly dull-seeming apathetic bunch of materialist greed-heads, to use a word from those smoky wild days of revolution. And we were, I think, much more selfish. I wanted to learn, and was selfish about that. The job is still to communicate, to make contact, and to cut through the chatter of our daily lives.
I knew then and understand now that there’s a great deal of tedium in learning. That’s true for everything. It was true when I worked as a typesetter, as a printer, as a graphic artist, and when I paint for pleasure. Does a professor have to become a YouTube star to be a good teacher? Absolutely not.
Where now professors of my generation may complain loudly about the pressures of email, I’ve also seen many of them opt out and refuse to answer students. When I was a student, professors didn’t do email, but how about this? They gave us their phone numbers and encouraged us to call. Because most night school students couldn’t get to office hours, the practice was to give out phone numbers. Compared to that, choosing what email to answer and what to delete seems like a different kind of commitment. Professors worked hard then, marked with whatever rigor they believed in, from extraordinarily thoughtful comments on form and content, to the bare grade circled at the top of the front page (your grade was circled? Lucky!); professors work hard now. Professors need to do what they always have: be on fire with the material. They need to believe that what they’re saying is worth it, and that the joint work of the class is enormously greater than the sum of the parts.
Students aren’t illiterate brainless slugs; professors still prize thought, investigation, and the pursuit of text. It wasn’t better then. It isn’t worse now. We need to get over that as an excuse for why we’re unhappy. If people feel students are illiterate, then there’s a cure for that: more reading and more writing, but there must be good underlying reasons for these things. Will reading and writing breed power and agency the way they did before computer screens swallowed our desks? If every class in first year committed itself to weekly writing, our students would be writing much more fluidly and authoritatively by the end of the year. So the question I have for my peers is not what they’d like to complain about, but what they’d like students to be able to do, and further, how are we all, students and professors alike, going to go about changing that? I refuse to believe it can’t be done. We may need to work harder. If it’s worthwhile to us, really worthwhile (not just something we adore complaining about), we’ll do it.
What makes a good teacher?
Kindness. Students are often scared that they won’t be able to do the work, that they’re in the wrong place, or that someone else, maybe everyone, is brighter than they are. They’re in the right place, most of them, but they need to be reassured of that. Few things have driven that home as much as becoming a student again. A decade ago I began taking voice lessons, having heard my father burn his voice out in the lecture hall. I trained with one of the best teachers I’ve had the pleasure to work with—the Canadian operatic tenor Torin Chiles. Suddenly I knew what it was, again, to be terrified of opening my mouth. But Torin made music and public speaking exciting new things for me, someone who’d by then been comfortably teaching for ten years. I remember Torin’s kindness paired with his ability to correct without fuss, ire or judgment.
After kindness, teachers need to have as few fixed ideas as possible about what is right, as few expectations as possible about what comprises a good idea, combined with an utter dedication to listening and thinking. We expect that teaching is about putting the right objects in the right order into students’ minds. I’m reminded of the great Maggie Smith’s take on Muriel Spark’s Miss Jean Brodie, who pompously lectures the principal of the girls’ school that the word education “comes from the root e from ex, out, and duco, I lead. It means a leading out. To me education is a leading out of what is already there in the pupil’s soul.” Although initially taken aback, the headmistress gathers herself quickly enough to retort pertly: “I had hoped there might also be a certain amount of putting in.” The teacher is there to give common ground, to provide the material in which she or he specializes. But overall, the job of the university is to make thinking and discussion commonplace. It is to foster a love of people thinking together, whether or not immediate problems are resolved.
If the exchange between student and teacher is going to work, I’ve learned that I must be able to hear students’ voices. If they can’t hear themselves, then they need encouragement. Lots of it. I use weekly writing responses so that students get a sense of their own voices. In kindergarten and grade one, everybody colored, everybody sang, everybody danced—everybody did everything. Nobody stopped because they weren’t professional at finger-paints. But when I ask now how many people are artists or musicians, few dare raise their hands. Our “adult” definition of an artist is someone who makes money, a living, doing the assigned task. That seems to kill the joy in just doing the thing for its own pleasure. Paint. Dance. Compose. Sing. Don’t take money for an answer. When students unhook the expectation that they must first make money at everything they do, then their voices tend to flow. And as that happens, they decipher what is important enough to them to spend time, the best time of their days, day after day, doing.
As much as university encourages debate, it is also full of people who harbour dreams of replicating themselves in their students. That is one of the greatest mistakes we could make. It’s a comfortable class for me if people say things I agree with: disagreement is hard, awkward, difficult, embarrassing. But disagreement is also a teacher, and reminds me that for all the conviction I have that my worldview is right, someone nearby disagrees, probably strongly. They may have thought about it deeply and at length, or they may just be formulating the words. I had a chance to do my formulations—why shouldn’t they?
Once students learn that they can say what they believe, that they don’t have to play the game of trying to please me by repeating my thoughts, then the discussions can begin. And as they do, I have to listen hard, watch what I’m saying to make sure that the class can discuss in the open. Is class going to be another ego exercise for me where I’m once more proved right and all other points of view are shown to be fatuous, misguided and ridiculous? That will be a quiet, miserable classroom. The problem of authority is an enormous one, but that doesn’t mean we can quit on it.
The discussion of authority leads me to something that has caused me much concern over the last decade, so I want to talk a bit about theory here, sometimes known as high theory, or cultural theory.
I went to graduate school during the height of the culture wars, where everything I’d worked so hard to digest as an undergraduate (the roots of western thought, ideas, religion, art, aesthetics) were being questioned or pushed aside. Like most graduate students of my generation (naked, dragging themselves screaming down the French theoretical embankment…naturally), I became a theory-lover, perhaps an addict. Once I saw the sense that open interpretative positions made (whether they came from Derrida or Fish), and how much the canon abused women, people of color and class, I worked hard to undo my assumptions.
But not all of it made complete sense. Theoretically minded students who argued that there was no order of priority between Shakespeare and his critics didn’t jibe with me. As intriguing as theory was, could it really replace the plays? If there hadn’t been Shakespeare’s texts, could there be theory about them? Didn’t that mean that there was at least causality, if not priority? This was a discussion to which I have returned mentally many times. Certainly it is human beings who place emphasis on one text over another, humans who construct hierarchies, humans who determine what is to be kept and what will be left for later. If it came down to a choice of studying only Shakespeare or only criticism of Shakespeare, then I would opt for the former.
As my experience with theory grew, I learned that a typical path to publication was to choose a contemporary theorist who was in fashion, apply the theorist’s ideas to a text, and discover what had been “missing.” I also learned that graduate students, who are enormously bright and full of ideas, were scared of being vulnerable. In an undergraduate class, one can be bright but sink into safe anonymity and still do marvellously well. Not in a graduate course where a class of some six or eight students emphasizes not only who has or has not done the work, but who is or is not theoretically adept. The theoretically deficient were considered to be clods, people who lacked the swiftness, the smartness, to grasp complex philosophical ideas usually based on French or German thinkers whose texts were themselves obscure and difficult in the extreme. Stratifications formed in classes. Those who “knew theory” were at the top.
So I learned to think and write with theory. But when I became a professor I realized that as important as theoretical insights could be, they also closed down joy, curiosity, disagreement, and creativity. For all that there was to be no canon in postmodern text, yet one always had to read Derrida, Kristeva, Jardine, Irigaray, Foucault, Deleuze and Guattari, and so on. For all that every fictional text was equal to every other, yet the same counter-narratives kept appearing: if one didn’t have a text by Toni Morrison or Leslie Marmon Silko, or other key authors on a syllabus, then it wasn’t complete. If there was no postmodern canon, then why was it essential to read key texts? Somebody was lying.
As time has gone on, I have become more irritable about theoretical practice in undergraduate and graduate programs. Many professors seem to regard themselves as latter-day Freemasons, protecting the Word, the Signs of Meaning. As in any order, the academic is inducted through a series of trials, and as she proceeds, gains access to more of the Mysteries. While gathering academics into Orders makes one feel comfortable (what a relief I felt once I gained my doctorate, and knew I could at least insist on this paltry honorific among my peers), it also puts crucial distance between the students and the teacher. Now emerges the insecure snotty graduate student who, as someone once said, needs to “admonish” their students about the ways of the world. Like the original Freemasons, academics are more akin to craftspeople, forging single bricks of scholarship (or the mortar for those bricks) and adding to a wall of knowledge and opinion. It’s the work that’s important, not who’s doing it. Unless you live in a celebrity culture like ours.
Theory is crucial to the notion of the academic as an Initiate into the mysteries. Few things separate the Master from the Apprentice more than a dose of deep critical theory. There is no excuse for bad writing—professors and teachers everywhere rapidly (and often angrily) tell their students. But most theory is written in jargon that must be comprehended and absorbed before it can be used. Proof of academic thought seems to be the wielding of theory in order to interpret a given text. Too many times I’ve heard professors lament that a student’s idea was “under-theorized,” as if the application of someone else’s ideas made the work worthwhile. If the ideas about gender, power, race, class, economy, aesthetics, have not been sifted through a high-theory filter, they won’t be deemed acceptable. In my field, to talk about the reproduction of something must be accompanied by nods to Walter Benjamin and Guy Debord, if not Antonio Gramsci and a host of others. If you want to talk about surveillance and seeing in terms of power, you’d better have read (and acknowledged) Foucault, as is true for the position of flows and stoppages of power and rebellion (Deleuze and Guattari), the self (Freud and Lacan), and on and on…and on. Stay alert, though, because the theoretical fashions will change.
These authors are all incredibly insightful and worth reading. But they are not worth making into the Mysteries that the professor knows and into which the Apprentices (graduate students of varying stages) shall be inducted. Theory is a bar used to separate the initiates from the Fellows of the Craft (basically, Journeymen), until they reach the tier of Master. Graduate classes, particularly, too often become an ego game played between all involved about who knows the secrets, who can divine them the fastest.
There are indeed many concepts that are worth learning. But there also is no one way through this process. For people whose pride is in thinking openly and fluidly, the theory prison is as bad as the hypocrisy about there being no postmodern canon. Even as I write these words I know that True Believers will argue that I don’t truly understand theory, or that I am a lost person, someone who aligns with the dead white European patriarchy, or that I want to protect my territory. Don’t be confused: the old order had to go, and I don’t lament its passing.
As much as one theorist may help an academic make sense of the world, that academic must also acknowledge that the student may well find similar comprehension and explanation in a very different theorist, even one utterly at odds with the professor’s politics. I wish I could say I’d seen students encouraged to be their own political selves, but it isn’t so. I’ve seen theory used in a mechanistic way to lock students into predictable patterns of thought, to reassure the professoriate that learning is taking place because the student has mastered various layers of Mystery, and to guard against the wild cards that students throw. Students who are unafraid can have ideas that may look as astonishing and new as they may look crazy. Is it a genuinely new idea, or is it some misunderstanding of a commonplace? Was the wheel just reinvented? (And if it were, that still must have been incredibly exciting for that student.) It can be hard to determine.
For all that academics talk about freedom, about disparaging fads and departing from the norm, we are afraid of the new, fall into predictable mass behavior about what ideas are worthwhile this year, and ensure that our students get with the program—whichever one it is that season, or in our heads, which may amount to the same thing. The way theory is used is not as a set of tools to solve problems about what might be happening in the text and the worlds that text reflects on. It is not put second. Theory is still primary, the way it was when my peers in the early 1990s, over 20 years ago, discussed the primacy of Shakespearean criticism. Theory is, and has been for decades, a technique (in the sense that Jacques Ellul spoke of la technique, a mechanistic process—and there to support my argument I refer to a theorist whose day has come and gone…what a sorry academic I am) used to close off creativity, surprise, and voice in the very students who will go on to form the new professoriate. And there won’t be a discussion about it: there are too many true believers in universities, places of debate, who brook no disagreement.
A great deal of the passion for theory looks back to the work of the hermeneutic scholar Wilhelm Dilthey, who sought a way to legitimize the work of the Humanities. Dilthey understood that the Natural Sciences could make claims to tell the truth, whereas in the Humanities all the scholar could do was amass sufficient evidence so as to attempt to convince people of the position’s correctness. Theory has become that apparent scientific device with its own encoded language, the ultimate appeal to authority, that can prove that one’s view is right. As the name “theory” implies, these ideas are just that: theories. They are guesses.
It’s students who suffer. Thinking is hard work, and coming up with new thoughts is a rare and difficult business. Theory mitigates that problem: the theorist is brighter than you and will speak for you. The theorist can speak and has spoken about any subject (if not, wait a week or so—someone will be looking for a way to be noticed). Hundreds of thousands of intelligent people on every continent agree that a theorist is right to draw some conclusion. The student, instead of having to determine and argue for a position largely on their own, is carried out to graduation on the theorist’s back. To make theory secondary or even tertiary, to require it but then to emphasize original thought, what the student actually believes, is to open the door to chaos. And, I know people will say, to mediocre thought. If what that means is that we’re graduating mediocre minds into the professoriate, and they are only as intelligent as the theorists they can summon, then what can we say of our situation, governed by technocratic thinkers?
Theory also provides the professor with a way to stardom: should they come up with their own theory that earns adoption (now more possible because information has the chance to go viral where before it moved relatively slowly), the academic can be guaranteed a berth in a better school, a major urban center, and perhaps some academic brand on the door proclaiming this an Institute.
Like so many things human beings do, I believe we cling to theory out of fear, fear of being wrong, of being found out, of not having answers. I don’t want to make people angry with these words. I want us to reconsider what we do, why, and how. I want us to ask whether we’re using theory as an assist to our thoughts, or in place of them.
I once asked one of my supervisors, someone who earned his doctorate at Harvard and taught at MIT for seven years before coming to Canada, if he was ever afraid that a student in his first year class might stand up and yell out “Fraud! You’re wrong!” He smiled hugely at me and said calmly: “Every day.” I think that academics are pushed by the desire for knowledge, by their curiosity. They are also, often, quite shy people, and while they are called on to perform publicly in lectures, may have a difficult time with people. Because theory requires a specialized vocabulary, the explanation for which is always that one must use a rare language to discuss complex ideas, since the words represent unique, complex markers in the theoretical play, the nervous academic can hide behind a linguistic wall. What if someone should discover the professor can be out-thought? What if, one shudders to imagine, a graduate student should be brighter and quicker than the professor, and not really need much in the way of guidance from the professor? What if the professor is literally outclassed? It’s hard, if you pin your self-worth on quick intelligence and depth of intellect, to have a threat come from below (in what is absolutely a class system, no matter what protestations are made to the contrary).
The pupil who outshines a teacher is no new thing. It is necessary and to be celebrated, as it often is. It is a struggle the first time one finds a student more gifted than oneself. I remember having to examine my envy, my mutterings about what I had that the student did not. Finding students brighter than we are is a hell of a leap for the teacher to take. And so I began the work of letting all competition go. Life has become infinitely more open and bright since.
How do you manage to command attention in an “age of interruption” characterized by fractured attention and information overload?
Following what I’ve said above, the surest way to make contact with a class is to be as real, as present, and as undeniably human as possible. If you don’t get through all your lecture notes, is it so bad? What if, as you were lecturing, you suddenly had an idea that was worth following, and instead of thudding away with a predetermined lecture, you shifted to a conversational tone and began talking directly about what your thoughts were? People will leave one conversation for another one if they’re being addressed directly, if you speak to them, instead of your idea of them.
To get people to relax their grip on a social network or a texting discussion, we need to carry them away to a new place. It’s going to take work, this new idea. It’s hard to grasp the thing the first time—perhaps impossible for most of us. When I look at my lecture notes, I wonder if I must cover everything I have in front of me. What if I stopped at point (3) of (5) and said, “I’m rethinking this.” What if, then, I laid out the problem in the simplest possible terms, but also my struggles with my own interpretation, why I was in doubt? My experience with this kind of genuine conversation with the class is that you’ll get the people who are ready to be there. Many people in my large classes are there because the course is required, because their friends took the class, or because they looked at the reading list and guessed the course would be an easy, or “bird,” course (birds have a very tough life…I’ve never been clear on this phrase, but it would be amazing to try to survive a day as a bird, let alone take a course in being one). I can’t hope to reach every one of some 300 students. The people who want to be reached will be. Some can be invited in, and many will take up the welcome if it’s real.
Above all, believe in the material, and be prepared to go off the written script. If you’ve seen documentary footage of film director Robert Altman directing, you’ll have seen him prepare his actors, let them get into character, and then at some indeterminate point that made sense to Altman, start the cameras rolling. At another point evident to him, he would stop the cameras, look down the script and decide that most of it had been covered. We can be too proud of our lectures with their little in-jokes, showing the depth of our learning. We need to be able to set the script aside and grapple again with the content. Our students need to see us thinking and join us in that process.
What advice would you give to young graduate students considering a career in the academy and what are some of the texts young scholars should be reading in this day and age?
Accept immediately that graduate school is a highly political environment, and that holding the wrong belief or subscribing to the wrong theorist can have a severe impact on your future. As soon as you can, find people with whom you click, who strike you as reasonable and fair. Leave the famous supervisors for someone else. A famous academic may well be someone driven by ego and reputation, certainly one with a group of followers with whom you don’t want to compete. There are loads of bright people around you. Each person I learned from had written books, some very well known in their specialties, but none of them in the academic star system. They were bright, funny (humor is a wonderful sign of humanity), and open to new things. And I mean, they were open, not just interested as long as you were prepared to parrot their ideas. Not many people know their names. That doesn’t mean they aren’t astonishing people to learn from.
When you’ve found your people, do your work. Stay out of student politics (if you want to finish your degrees on time), and leave aside romantic notions of what writing a dissertation or thesis involves. Like any book, writing these documents takes steadily recurring, large amounts of time. It is not some kind of final Mystery that requires the rending of veils. A dissertation is a document of your ideas about a problem or issue that interests you. It isn’t part of a romantic grail quest for the Truth. The dissertation is a step toward becoming a teacher and being an ongoing student. The more you write, the more you will realize you don’t know.
Plan to do the work practically. Don’t slog through the comprehensive exams and then decide that you’ve earned a tour of the Continent. You probably didn’t finish second year undergrad and reward yourself with a year off. Sure, the comprehensives can be hellishly stressful. Take a week or two off. See your friends, get sleep, see some movies. Then sit down and plan how you’ll finish the degree.
What should you be reading? Read what you love, always. Don’t let anyone shame you out of your joy. Look for the book that will turn over the ideas you’ve had until now. When I was just starting the doctorate I had the amazing experience of reading two superb books back to back: Paul Fussell’s The Great War and Modern Memory, which showed me how to write about war in a way I’d never understood before. It also acted as a prose model for me, and gave me insight into lean, elegant writing. Immediately after that I had the tremendous good fortune to read Philip Beidler’s American Fiction and the Experience of Vietnam. Beidler became an off-campus mentor for me—his writing, droll, insightful, jargon-free, has been an ongoing influence on mine. Beidler is one of the finest academic writers, also just writers in general, I have had the luck to read. But likely these books won’t work for you. Look for the ones that will.
As much as possible immerse yourself in elegant, accessible, intelligent prose. The world is loaded with it. Remember that if the university is to be part of the world, we need to write clearly and well. We should be able to frame complex ideas in ways that make sense to people. I don’t mean there’s a license to condescend to people (that can seem fun as long as it’s not you on the receiving end), and certainly this is no suggestion that you dumb down your work. It takes a great deal more intellect and work to write clearly for a non-specialist audience than it does to write for the Initiates. But if we insist on writing for the clubhouse, we can hardly complain when we’re considered arrogant, out of touch, and elitist.
Over the course of the last 16 years I’ve given hundreds of interviews on different topics that range from the politics of using specific weapons in the wars in Iraq and Afghanistan, to the Simpsons reaching their 20th season. Those interviews have given me a picture of how academics are seen by the public. We’re experts, yes. Also apparently humorless, disconnected from the “real” world. You’ve heard about the Ivory Tower? I gather that I own one. Academics have a lot of work to do to explain why we’re valuable to the world, what we can contribute. If we take a position of arrogance, nothing will change. I’ve come to believe that few of us know other people’s jobs. I don’t know what it is to be an accountant, a server at a coffee shop, a lawyer, a nurse, an ultrasound technician. Every time I meet someone I want to know what they do, how they do it, how they see the world. I’ve found that we don’t ask, and mostly don’t respect, each other’s work. I’ve learned amazing things from supermarket check-out workers, who are some of the canniest people-handlers in the world. Teachers have a huge amount to learn from everyone. If we don’t respect the vast number of ways people live in this world, why should they respect us? Because we were lucky enough to go to school for so long? That doesn’t mean school was easy or cheap. But we didn’t have to quit to support a sick relative, or take a job that would pay off immediately, even if it killed our longer term dreams. Nobody was born destined to be a check-out clerk or a janitor. Or a professor.
Do you think the university as an institution is in crisis or at least under threat in this age of academic capitalism and digital interactive media?
Universities have always been in crisis. If you focus a lot of bright people in one place, and they’ve done a lot of thinking (which professors usually have), they have probably come to wildly diverse opinions about just about everything. Stalin was right to go for the universities first: it’s the most obvious place to find new leadership, to find people who hate going along, who hate saying Yes because everyone else says so.
At the same time, I find it ironic that as professors try to squash mark inflation, their own processes have become vastly inflated. There was a time when a book guaranteed tenure, when two books meant that the professor could, if they pleased, let the reins go slack and plod along. Those days have vanished, and the university has become a factory of production. One book may get you a job. The next will secure tenure. Plan on keeping that up. As long as one is publishing large quantities of matter in reasonably reputable sources, all will be well. Instead of valuing thought that is ready to be seen and read by others, we value thought that can be seen. We may loathe the commodification of thought, but that hasn’t stopped us from using publications and quantity of activity as metrics from which to determine value.
Hanging over us at the moment are the so-called MOOCs, or Massive Open Online Courses. We’ve been heading in this direction for a long time, and I expect there will be a lot of confusion and worry before things begin to resolve. One of the things that’s common to the newly arrived digital world is that we assume that, until now, things have been unchanging and peaceful. Universities have been in a ferment for over a century. Just about each decade has brought significant change to our definition of the University itself, who we will allow to attend and what we will require of attendees. Digital change tends to move at a speed that we can see and measure from year to year, instead of over the course of decades or generations. In the first classes I taught at the University of Western Ontario in 1997, a third of the class had email accounts, a third had heard of email but weren’t interested, and the last third were terrified of it and refused to sign up for it. When those students graduated in 2001, they had no idea what a smart phone was, nor were they interested in carrying around an expensive small version of something they were used to using in private on a desktop. Few of them carried around the expensive and rare thing known as a laptop. Above all, the word private meant (and means) something different in 1997, 2001, and 2013.
If we believe in education then I think we can accept that it’s going to need to take a lot of different forms. Connecting with students as people, knowing what they struggle with on a number of different levels, means I can best match how I talk to them. I hope that what I’m introducing will be as accessible as possible. That doesn’t mean that other methods of teaching are invalid or worthless. I believe in people, just the way I believe in craft. I believe that we could have, if we were prepared to accept a much slower way of living, an economy where carpenters consulted with us about what kind of furniture we wanted, what suited our needs, what hand made furniture, or textiles, or pottery, we wanted to have in our lives. But we would have to forfeit a mass way of life in order to do that. For us to give up on MOOCs, or decide they were a potential attack on the way we think about teaching, we would have to decide that education for everyone at every possible level really was something worth doing. MOOCs are another magic bullet, the promoted solution to the global, not just provincial, state, national, continental, problem of education. All the previous magic bullets (remember that videotape was also supposed to level the educational playing field) have left a wounded public body. We can indeed have global education, and just as with other things we’ve done globally, likely it will benefit those who already get the best of the high end, who already find themselves (even if they don’t think so), among the very privileged few. The MOOC is the latest in an ongoing series of iterations of how we will, apparently without any cost to anyone, produce literacy throughout the known universe. If we need to gut our present university system and reduce our actual teachers to a tiny string compared to the legions we really need, then in the name of the Ikeaing of what we do (the furniture may not be great, but it isn’t terrible, is it? And you can get it in four different finishes), we’ll do it.
Do you think there is a distinctively Canadian school of communication? If so, what are its distinguishing characteristics?
For a while there really was a distinct group marked by Harold Innis and Marshall McLuhan. Most media texts I read now sound very like each other, no matter where they come from. Australian media texts discuss the same issues as those from Britain and the United States. They all have slightly different forms of reference and use different cultural examples as they illustrate the problems they discuss, but I think the work media scholars are engaged in, no matter where they place themselves politically, is familiar and agreed upon. Although the academic world is pretty good about updating itself and spreading new ideas around, the sheer speed of digital communication has broken down more scholarly barriers. That doesn’t mean that each country doesn’t have a tendency, a leaning toward a local color in media thinking. Politics always shades how people see the media, how we understand the notion of a “free” media (instead of a free press). Canadian media scholars like Robert Babe and Nick Dyer-Witheford have done a great deal to persuade us that corporations ineluctably control most of what we see and read. But good ideas are good ideas no matter where they come from: I hope we’ve been able to leave behind who said what and instead focus on how to solve the problems we have before us. If nationalism makes for schools of media theory, then it’s hurting us as much as obedience to any set of unassailable ideas.
What is communication anyway? Do you think communication studies should be a discipline in the first place – concerned as it is with a sort of nothingness, i.e., the invisible effects stemming from technological, material and symbolic environments?
Communication is us. We can’t stop ourselves from sending and receiving a constant string of messages to the world around us, whether there’s someone there to interpret them or not. We’re perpetually leaping up and down on the branches yelling in shock, amazement, surprise and hilarity at everything that goes on—we’re pure reaction. It makes sense to me that we have such a thing as communication studies. At the same time, I am a dedicated believer in generalists, and in training people to synthesize and bring together unlikely subjects. As soon as we cordon off something we make knowledge territorial. The world comes at us with problems that are unlikely to be addressed by any one discipline. As we’ve carved the disciplines into smaller units, fostered in each of them their own private languages (usually for the purposes of, ironically, easy communication), we have made solving problems inordinately complex. One of the things the ecologically concerned often have to remind geographically fixed politicians is that water doesn’t observe any but its own boundaries, and that ecospheres aren’t coterminous with national boundaries. Politicians rooted to a local constituency, even one as large as a continent, must give up regionalism if the problems are going to be treated. If people were prepared to advise from a discipline but equally take advice, and even epistemological positions, from their peers, then I think we’d be in a better condition to meet problems. Few things are more discouraging than to see scholars who agree about so many crucial issues then fall out and fight bitterly over terminology and connotation.
Communication Studies means that there’s a zone for specific discussions about how things are done, and what each small morsel weighs. The reversal is that disciplinary boundaries often create no-go zones or, more dramatically, ghettoes, where only the faithful enter. If Communication Studies becomes another way of saying Public Relations or Advertising or Data Mining, then the point will have been lost. We’re so used to driving for efficiency that we balk at the idea of replicating the work of other disciplines (which may also be possessive, and justifiably worried that the untrained get it wrong). In fact we need to know how History understands communication, just as Communication Studies needs to perform its own historiography. This goes for all disciplines.
In Slaughterhouse-Five Kurt Vonnegut says, “Earthlings are the great explainers”—those explanations occur through our various sign systems, whether visual, aural, or otherwise. Yes, we need to study the details of how we go about explaining, how those forms can be influenced, compromised, how they can assist us. But we have to remember the lessons of ecology: communication doesn’t follow obvious lines of force that make a great deal of sense. We hate that, because it’s puzzling, scary, frustrating and illogical, like so much about human beings.
Your primary area of research interest centers on the ways that culture and cultural products revolve around each other. You write: “When I study popular, mass, and folk culture…I’m looking for the ways in which we tell stories to each other.” Obviously, the stories we tell each other contain an embedded ideology, yet as Slavoj Žižek once claimed, ideology could also be thought out as the “unknown-knowns” – those things we don’t know that we know. How do you approach ideology through story-telling?
Each culture is locked into its beloved stories. It could choose to change them, although I think many people assume that recognition, identification and critical analysis of cultural narratives also means improvement will naturally follow. The more I study popular culture the more I’ve come to believe that not only do cultural artefacts like The Help or Avatar make entertainment for people, they also make people feel that something (it’s not exactly clear what) is being accomplished. At least “the truth” is known more generally, now. The Help showed us that black people like helpful white women if they’re young, sweet and beautiful. Avatar showed us that we can continue resource depletion because if we do it properly and are helped by altruistic blue-skinned natives, we can enter an Earthly (almost) paradise. Popular culture can spin dirty cultural straw into narrative gold. Then, tomorrow morning it’s off to work as usual.
I recently spent a summer writing about American war films depicting the wars in Iraq or Afghanistan. It’s early yet to be looking for the great films that might have cultural clout similar to Platoon, Apocalypse Now! or Full Metal Jacket, but my guess, after closely reading the over fifty films about either war (leaving aside the 9/11 tribute films), is that the United States and Canada have no interest in even bothering to create myths about these wars. They rank so low in interest that I’ll be surprised if students in one or two cohorts (that is, four or eight years from now) have more than a vague sense that the world, and North America, was involved in fighting two wars in those countries.
Mass entertainment is a reassurance engine. If we look for irony, I’m sure we will find it. But we may find it in places where only a few million people look. The web has made it possible to make money from relatively tiny niche markets, has made it possible to address very specific demographic groups. But just because different narratives are available (I was able to actually lay hands on all fifty films I wanted to see entirely thanks to the web: if I’d had to rely on films coming to theaters near me, I might have been able to see 10% of them) doesn’t mean those narratives are widely spread, discussed, accepted or, more than anything, acknowledged. The mass market is still the big game, The Show. We’ve gotten a lot smarter about allowing ourselves to look away. That means that even those “unknown knowns” recede further from us. We all have excuses—life is tough enough, and it’s hard to look at what is happening to the weakest or most vulnerable in society. If we do admit there might be a problem (with off-shored sweat-shop non-union back-breaking labor that causes thousands of deaths on our behalf each year, as recently came to light—again—in Bangladesh this time, somewhere else next time, but always somewhere), we’ll consider it resolved when the largest stakeholders (Wal-Mart and H&M) sign toothless documents promising not to let it happen again. Not only do the issues get further away, but our compassion for the people suffering in those issues seems to have dropped. At the heart of Communication Studies is the idea that we can reveal that which makes our reality—discover the ubiquitous unknown knowns that we’d rather not contemplate. I have to emphasize, though, that revelation is the barest beginning. Identifying our adored narratives (about democracy, the free market, social control of public issues) tends to make those myths only stronger and more robust.
If I had to take a guess at how to go about actually reworking the narratives and their attendant outcomes, I’d start with the issue of children. People believe that it is their right and duty to have children because biologically so many people can. That doesn’t make it a good idea, though. If we begin by considering what expectations we hand our children, how much we expect them to have in terms of clean water, food, fuel with which to heat, cool, transport objects and beings, we might realize how much our narratives tell us we’re entitled, what we’re owed.
Of course, such commentary may elicit the “crazy talk” reaction: we’re all for examining our problems and issues, handling, maybe even reducing, racism, sexism, classism. But let’s not go nuts about the rest, Tim.
You’re also interested in War as a cultural phenomenon. For instance, your book War X: Human Extensions in Battlespace deals with war technologies and the body in the 21st century. I wonder how your notion of extension is similar or different from that of Marshall McLuhan or Don Ihde…
Ironically I probably line up more with McLuhan than Ihde on these matters. I think McLuhan understood that things were probably already out of our control. It has been intriguing, over the years, to see the debate evolve on the prosthesis of the self. But my vision of the way things have already gone and will go is a great deal grimmer and less chipper than Ihde’s. As with the metaphor (if that’s what it was) Haraway proposed in Simians, Cyborgs and Women, the reality of being a cyborg (or a goddess, the two possibilities that Haraway proposes) hasn’t worked out the way feminists hoped. Even her second and less playful Modest_Witness@Second.Millenium didn’t stare down the bleak corporate future that was already upon us at the time. I share much more of Jacques Ellul and Paul Virilio’s worldview that we have already been drafted, whether we are prepared to admit it to ourselves or not. This answer connects to what I’ve been saying about children: it isn’t just that no child is born free, because all children are born into a necessary matrix of language and signs that will keep them alive (for varying periods of time). But there are cultures like ours that have exchanged a relative degree of technological separation for a nauseatingly tight coupling between the body and machine. I expect to see the disappearance of the smart phone within the next five years as technology will literally slide under our skin. The underlying narrative, the mythic story that popular culture tells us, is that the individual is the boss of it all, and can have things its way. The reality is otherwise. In every direction we are closing in on ourselves and seem baffled by our own apparently contradictory panic, desperation, and desire for ever more speed. At best we are the underbosses.
You write: “9/11 has brought neither military nor policy surprises when we look at the last ten years of killer culture: the Revolution in Military Affairs is being televised (so you have to adjust your set).” Looking back at the semiological reductionism of the 1980s, how do you feel, in the context of war, about such postulates as The Gulf War Did Not Take Place? What are the risks of overemphasizing symbolicity over other pillars of world [and] self, such as embodiment, sociality or temporality?
Luckily the tides of theoretical excess have ebbed a little. Most scholars now seem eager to look at actual things instead of clever theoretical constructions of them. It may be that Baudrillard’s essays were useful in establishing the ground rules for how we deal with war as represented in the digital hyperreal. Virilio has taken those thoughts much further and produced work that is more helpful in understanding how the Revolution in Military Affairs recreated the United States’ military. The Navy’s reliance, largely because of the influence of leaders like Arthur Cebrowski, on network-centric warfare made for a lot of early, easy hits on military thinking. Real discussion takes longer: this is the problem I have when a serious ongoing issue becomes the academic flavour of the month. It may not help war (sometimes euphemistically “conflict”) studies to have an academic Angelina Jolie descend from the heights, emit some sound bites, and then return to Olympus.
My work has taken me away from pure representation and to the things in themselves—the weapons, the policies, the designs that operate directly in contact with people, that people touch and use on a daily basis. I still do work that is concerned with representation. That is where I began my thoughts, and much of my work begins there, or acknowledges the territory. But for over ten years now my main focus has been on what is embodied, what is made, what we produce, and then how we construct myths around those artefacts.
What are you currently working on?
Despite my slowly turning focus, I have been clanging away for years at a book on war and images. So much work has been done on propaganda. Famous war posters from the late 19th and the whole of the 20th century are inordinately powerful, often aesthetically beautiful, arresting graphic texts. As a sometime graphic designer, I am repeatedly impressed by the posters of 20th century modernism. Sam Keen’s insightful Faces of the Enemy does an excellent job of deciphering how these texts work, what they have in common and why they have had such staying power. Certainly a number are icons, as much as Marilyn’s skirt and James Dean’s white t-shirt, but Keen figured it out. As I was finishing my first book, which is about war technology in the 21st century, I began to get interested in the various ways we use images to make war attractive. Particularly after 9/11, the splattering of camouflage patterns on everything, especially women’s fashion, made me reconsider how Jacques Ellul’s “integration propaganda” infuses the cultural air we breathe, the reality we accept, our normal. For a while the flag was shown to excess, wrapped around the country, if not the continent. Then it became camouflage clothing, right down to camouflage underwear. Desert relish (the name for the digitally randomized pattern used on new camouflage garments), but in pink, for little girls. As I looked around and saw the otherwise physically uncomfortable, gas-guzzling, rough-running humvee become the new war utility vehicle of patriotic choice, or in L.A., the rare, exorbitantly costly Mercedes Gelandewagen (the German military’s official troop field vehicle), I understood that war makes for very attractive fashions. That produced my next plan for a book about different kinds of images at war.
The book has been interrupted, an interesting process in itself, by unfinished business. Each time I moved toward the new material, I found leftover work from War X that needed doing, and so for a number of months or research periods I put aside the newer work to round out the rest of the book. I had never written a chapter about nuclear war, but I needed to. I was curious about the physical future of the gun, and so had to step aside and take that up. I was perplexed by the way fast-moving contemporary action cinema had changed since the mid-to-late 1980s, and became curious about how special effects might be used to naturalize and sweeten extreme violence in film. Each of those excursions cost me progress on the current manuscript, but also proved to be necessary so that I could move ahead. The more I work, the more I find that while I generally think I know what I’m doing, there’s another much more active part of me that truly understands what I must do next. It is another reminder of what a strange thing it is to be a human being.
© Excerpts and links may be used, provided that full and clear credit is given to Tim Blackmore and Figure/Ground with appropriate and specific direction to the original content.
Ralón, L. (2013). “Interview with Tim Blackmore,” Figure/Ground. June 4th.
< http://figureground.org/interview-with-tim-blackmore/ >
Questions? Contact Laureano Ralón at email@example.com