Interview with Peter K. Fallon

© Peter K. Fallon and Figure/Ground
Dr. Fallon was interviewed by Laureano Ralón. December 30th, 2010.

Peter K. Fallon is an Assistant Professor of Journalism at Roosevelt University in Chicago. He has more than twenty years of professional experience in television production, video editing, and electronic journalism; over fifteen years of teaching experience at the university level; a significant record of scholarship and publication (his most recent book is The Metaphysics of Media); and a terminal degree (Ph.D.) in Media Ecology from New York University. He is the current editor of Explorations in Media Ecology, the peer-reviewed journal of the Media Ecology Association.

How did you decide to become a university professor? Was it a conscious choice?

Of course it was, at some point. The problem was not so much one of decision as of discernment, of seeing the various choices that were, in fact, before me. We do not live in a culture that much values either education or educators, and many young people will choose a paycheck (especially a nice fat one) over a life that offers rewards more spiritual and intellectual than pecuniary. Thirty-five or forty years ago I had no idea I’d be doing what I’m doing today. And I probably would have laughed at the prospect. But the cumulative effect of my life experiences over the years pointed me in a direction and provided me with evidence that it was the right direction. I once thought I’d be a musician (my first college major was music theory) and later I wanted to be a recording engineer. My first real job was making educational and training videotapes for the New York State Office of Mental Health, and then I spent nearly two decades with NBC News. But all during this time I was in and out of school (mostly in, in retrospect), working first on a Master’s degree in Communication, and then on my Doctorate in Media Ecology. And I loved every moment I spent in the classroom as a student.

I had some opportunities to teach as an adjunct along the way and I loved sharing what I was learning, especially in my work at NYU, with my own students. My own teaching, I hope, is influenced by some of my best teachers who were able not only to give me objective information but were also able to convey (or, perhaps, unable not to convey) their fascination with a subject. And unlike my work with NBC, which was seen on a daily basis by several million people to whom I remained completely anonymous, and whose effects on those millions of people I shudder to think about, my work with a handful of students was immediately gratifying, immensely rewarding, and unambiguously positive: I help them to see things deeply embedded in their culture that were previously invisible to them, and I help them to think about those things through systems of critical thought they had never encountered before this class.

One last point about this particular conscious choice: in order to accept the offer of my first full-time teaching position, I had to leave NBC News and take an enormous cut in pay. To be precise, I took a $60,000.00 cut in pay. I can assure you I didn’t make this decision lightly or without deep, deep reflection. So I had a whole series of conscious choices to make along the way. Eventually and inevitably, I believe, they were all leading me to teach. It would have been a terrible personal tragedy for me to have ignored all the experiences I had in the course of this journey.

In your experience, how has the role of university professor evolved since you were an undergraduate student?

Well, that’s the funny thing about “the social construction of reality,” isn’t it? Reality never changes. It is our attitudes about reality that change. Technologies play an enormous role in our socially-constructed reality because they are the media that constitute the message of our society. New media restructure human relationships and therefore human priorities. They restructure both the things we think about (or not) and how we think about them. Media, as Neil Postman said, are epistemologies. But they are epistemologies with firm ontological foundations – they are extensions of our senses (or, in the case of most of our media, only two of our five senses). We tend to know what is real to us and what does not appear to be real we ignore. And so roles change because situations appear to have changed because our media have changed.

But what has changed? The truth is that nothing changes. As it says in Ecclesiastes (1:9), “What has been will be again, what has been done will be done again; there is nothing new under the sun.” Or as Shakespeare put it (Sonnet 59), “If there be nothing new, but that which is hath been before, how are our brains beguiled…” It is we who have changed, we who are beguiled by technological change, we who have ceased to believe that a certain situation exists while beginning to believe a new one has replaced it. We still love and hate, suffer and feel joy, resent and admire, covet and sacrifice. We still allow some with power to exploit and marginalize others without power, and we still look on quietly, feeling bad about it all but doing nothing. Nothing at all changes when new technologies are introduced into a culture. Nothing changes but our attitudes about what is and is not “real,” what is and is not “important,” what is and is not worth knowing. And we change because we choose to change, because media, as McLuhan tells us, are nothing more than extensions of us.

It could not be any other way. One of the things I have grappled with over the years (as do most media ecologists) is the problem of technological determinism. The term is frequently tossed around quite carelessly, I think, and many times is so tossed in lieu of a reasoned argument against theories of technology and social change. But still it is there – the accusation: determinist! – and the serious media scholar will consider the possibility that a certain idea is, after all, deterministic. I have always been a believer in two truths, one immanent and one transcendent: the virtually limitless potential of human intelligence and the possibility of free will. Anyone who believes in either of those truths – but especially anyone who believes in them both – will seek other explanations for phenomena than deterministic ones. Nothing has to be. Whatever is, is because we’ve either actively made it so, or allowed it to exist without our resistance.

All of this is to say that many of the role changes we’ve seen and experienced in academia – and in “the real world” – are entirely unnecessary, a reaction to nothing more than an appearance of change, and a more or less unquestioned assumption of the reality of that change, that has become part of a socially-constructed (as opposed to objective) reality. What I thought was important as an undergraduate student three and a half decades ago is still important today. The basics never change. It’s really as simple as that.

When I was an undergrad, those “best teachers” I referred to in the previous question were concerned with making me think, and making me think critically, and they provided a framework of theoretical knowledge and principles derived from that knowledge to help me do so.

They demanded that I read. It didn’t matter whether I thought the text they were making me read was “relevant” to my life or not. They knew that I was in no real position to make such a determination. They knew that the relevance of a text would not necessarily be apparent to a young person of eighteen, nineteen, or twenty years. Relevance can be judged only in the light of experience, never decided beforehand. Such a decision reflects culturally manufactured desires, not a mature acknowledgement of human needs.

They demanded that I not only have an opinion (for who on this earth has no opinion?) but that I be able to support that opinion with reasoned argument, logic, and evidence, and they modeled behavior that gave me the confidence to be tolerant of other opinions, but unafraid to question and challenge them.

They demanded meticulous care in my use of words. Words, I learned, were the brick and mortar of reason. Every rational judgment I made would be made in words and expressed in words and I’d better strive for precision or be prepared to live with sloppy thought and poor communication skills.

But equally significant in my experience, my best teachers also showed me that they cared about human communication, that it is fundamental to our humanity, and that it is important enough to study it so that we can not only understand human interaction (for all human behavior is symbolic) but also improve it when it falls short.

That was what my best professors did four decades ago, and that is what I believe the best professors do today. None of this has changed. This is objective reality. But all of this kind of begs your question, which is essentially that in our socially-constructed reality, new media have presented to us new roles and redefined relationships requiring new “skill sets” and ways of learning. And I believe – and there is an awful lot of objective data that supports this belief – that it is our acceptance of these presumed new roles and relationships that accounts for at least some of the failures in American education in the last few decades.

As I write this, I have learned that one of the greatest teachers I’ve ever known, my mentor and dissertation chair at NYU, Christine Nystrom, died yesterday (December 22). I am saddened by this news almost beyond words. It is difficult to explain just what Chris meant to me as a young scholar full of questions and uncertainty. She embodied the qualities I’m talking about and inspires me still to provide my students with the lessons she gave me: observe, question, think critically, think clearly, write clearly, and above all care. If we ever allow the role of university professor to “evolve” beyond these simple ideas, then God help us.

What makes a good teacher today? What advice would you give to young graduate students and aspiring university professors?

I believe I’ve already answered that, although I certainly admit that most people today would be inclined to argue against my point. What makes a good teacher today is what has always made a good teacher: command of a subject, a critical mind, a demanding nature, and an ability to inspire students to pursue knowledge for some end beyond mere financial rewards. A good teacher might be entertaining and funny, but shouldn’t set out to be. A good teacher may have broad experience with and skills using technology, but the mere possession of such experience and skills doesn’t make one a good teacher.

My advice to people who want to teach is pretty simple and very likely to be ridiculed: don’t believe the bullshit. You’re not there to help students get skills for a workplace. You’re not there to make them more marketable. You’re not there to provide them with answers to petty, superficial questions. You’re not there to impress them – or yourself – with the latest technological wonder that promises to make something “better” but will probably only shorten some algorithmic process and benefit an employer. You’re not there to mass produce replaceable parts for the machinery of the global economy. You’re there for one reason and one reason only: to make them better people than they were when they came in.

In order to do this, you’ll have to push them, prod them, cajole them, anger them, question them, and make them question themselves and their own previously unquestioned assumptions about the world. You’ll have to butt heads with your colleagues, your school, your administrators. You’ll have to be prepared to explain yourself to others who will want to know why you appear so out of sync with your culture. It’s not easy, so you’d better get used to it.

Or, like far too many university professors today, you can aspire to nothing more than merely cranking out more cogged wheels for the machine, being a servant of the technological society.

As usual, the choice is ours to make.

I know you are a practicing Catholic, as was Marshall McLuhan. How did your Christian beliefs influence your career trajectory, your research interests, and your perception of communication studies as a discipline?

I’ll go you one further: I went to Catholic grammar school too, Cure of Ars School in Merrick, New York. I was educated by Dominican sisters in both grammar school and high school, and eventually spent ten years teaching alongside them at Molloy College in Rockville Centre, New York. My wife Mary Pat and I are both Dominican Associates – lay members of the “Ordo Praedicatorum” – the Order of Preachers. But I think it is dangerous to look at someone’s religion or their religious background or their upbringing in general and assume, “Ah-HAH! That’s why they came out the way they did,” as though two children of the same parents raised in the same household can be expected to think and act the same. On the contrary, it took me years – decades – to “find my faith” and I’m still working on a day-to-day basis to figure out what that faith means. Indeed, your question misquotes me. I usually describe myself as “a practicing Catholic, and I’m going to keep practicing until I get it right.”

I’m actually a very bad Catholic, or so I have been told by people who define themselves as “good Catholics.” I come into conflict with my Church in a number of areas (it is not necessary to detail them here). While I don’t agree at all with the category or the label attached to it, I am considered by some a “cafeteria Catholic,” which is to say I have real problems with (among other things) the principle of Papal infallibility and patriarchal structure and tend to choose which teachings sound to me to be authentic reflections of Divine will and which ones seem little more than the whims of fallible humans (men, to be clear). I’m not proud of this tendency and I constantly question my own motivations for thinking as I do, but I am in no way ashamed of it either. It is part of the person I am and if there is a God (and I believe that a God exists) then I would be doing an injustice to that God’s creative power to be anything or anyone other than what and who I am. For better or for worse – and may God forgive me for my arrogance – I will not be swayed by the teachings, the traditions, or “the magisterium” of a faith simply because the culture that faith engendered and continues to support says I must be.

So, no, I don’t think my faith influenced my career trajectory, my research interests, or my attraction to the ideas constituting the meta-discipline of Media Ecology. I think some part of me that I can’t describe, that I’m not sure exists, that doesn’t reside in any organ or system of my body, but that still makes me who I am influenced my interests, made me see Media Ecology as a system of investigation uniquely suited to examining interactions of technology and culture, and eventually led me to a place where I felt compelled to embrace a particular faith. Again, I don’t see anything – not a technology, not a culture, not a language, not a religion – as having that sort of power. That, to me, smacks of determinism. I’m not quite as convinced that we are not genetically predisposed toward or against certain attitudes, stances, strengths and weaknesses, etc., but that’s another story. What I’m saying is that I chose Media Ecology, I chose what subjects interest me, and I chose my faith – and all of these choices came after decades of questioning, longing for answers, uncertainty, confusion, etc. And I continue to choose them on a daily basis.

In 1999, Eric McLuhan edited an interesting book with a provocative title – The Medium and the Light: Reflections on Religion. Is God the Light?

Well, that question just packs metaphor upon metaphor, doesn’t it? Both the word and the idea of “God” are metaphors for something we can’t begin either to understand or to even imagine. “God” is the purest act of the human imagination, arguably the earliest of our inventions. Pure metaphor. And “light” or, even better, “the light” is a metaphor perhaps even more primal than God. So you’re asking me, in essence, if one metaphor constitutes another metaphor. My answer is yes, metaphorically speaking.

What is the use of “God” if not to stand as a metaphor for the goodness that human beings find in the core of their own beings? “God” is goodness and right. “God” is creation and, by extension, the creative urge. “God” is the organizing principle of the universe, the “logos,” the Divine, Cosmic wisdom, negative entropy: perfect and unbounded love. “The light” is the good, the opposite of darkness (and, therefore, by extension, of ignorance), the source of life, provider of warmth and security. Light is truth. And the source of light (metaphorically speaking) is love.

So of course there are multiple layers of overlapping meanings between these two metaphors and it should come as no surprise that we’ll use one metaphor to explain another. I have no problem saying, within this context, that “God is the light” any more than I would saying “God is love.”

Indeed, the electric light has always been a privileged medium to media ecologists. For instance, at one point in the documentary Picnic in Space, McLuhan turns on a flashlight and remarks that light does not have a point of view; that it radiates in all directions at once, having a spherical, auditory character. He believed that the electric light was the only medium that had no content – the only medium whereby medium and message were the same. This is very phenomenological, I think. Unlike the empiricist notion of consciousness as a passive absorption of sensory impressions bombarding us from the external world, phenomenologists regard consciousness as transcendental, i.e., as pointing outward into the world. In a sense, both the electric light and consciousness could be viewed as a sort of nothingness. Consciousness, by way of intentionality, emerges attracted by something other than itself, while the electric light becomes transparent and withdraws from our conscious awareness to create an environment that permits us to focus on specific entities rather than on itself. Aren’t our invisible environments then, much like consciousness, a sort of room-making nothingness which pierces through the heart of being?

Neil Postman talked at great length about technology answering the question “How do we do something?” but that it was up to philosophy – particularly the field of ethics – to answer the question “Why do we do something?” The history of science, in general, is the history of our growing understanding of the physical world, and empiricism has played a central role in this process. The history of technology is the history of applying that understanding of the material world to solve some sort of problem. The history of philosophy is the history of human groping for meaning in the raw data of material reality. But raw data presents itself to us as more or less objective, and meaning can appear very, very personal.

All of this is to say that I’m not sure I can answer your question satisfactorily in the way you asked it. McLuhan’s observations about electric light (like all of McLuhan’s observations) are not terribly objective and don’t lend themselves to empirical investigation. That is both their weakness and their strength, as Lance Strate has reminded me time and time again by emphasizing the heuristic playfulness of “the probe.” The fact that electric light is pure content (rather than having no content) is an illustration of McLuhan’s aphorism “the medium is the message” – the significance of electric light is its restructuring of the day and of the traditional human understanding of time and its utility.

But no artificial light is omni-directional; it is in the nature of artificial light to exist in artificial space, and there is always something or someone holding on to it or hanging it. There is no three-hundred-sixty degrees (cubed) with artificial light. We might also consider the laser, which is focused electric light, a single point rather than an infinite number of points (cubed), and is nothing if not subjective. Point of view is part and parcel of artificial light, and let’s not fool ourselves into thinking otherwise regardless of McLuhan’s heuristic and playful (if sometimes dangerous) eloquence.

Nor does light resemble consciousness, I believe, in the way you suggest. To paraphrase the words of my friends Michael Quirk and Stephan Mayo, we don’t “radiate” consciousness. I honestly wish we did. If human consciousness were like radar – another form of light – and radiated out from us and brought back to us impressions of things of which we would not otherwise be aware, we would be, as a species, a lot better off than we are now. Consciousness is indeed intentional and that is both the strength and the weakness of human intelligence. For our intention can be to use our technologies to build a comprehensive understanding of our world and to address its problems, or our intention can be to use our technologies to create our own comfortable, self-sufficient, solipsistic worlds and to ignore the objective reality that surrounds us.  I write at some length about this in my book The Metaphysics of Media.

The point of all of this is merely to note that McLuhan was engaging in metaphor to illustrate something that was difficult to express in objective, literal terms. And so was Plato, and Aristotle, and Aquinas, and Descartes, and Kant, and Hegel, and Husserl, and Sartre. Or if they weren’t intentionally engaging in metaphor then we must understand that they ought to have realized that their ideas were, in fact, models – metaphors – that attempted to explain the interplay of matter, imagination, and mind but were doomed to be pale, inadequate reflections of a reality that we are not, at this moment, fully able to comprehend. Each of these metaphors focuses our attention on a specific dimension of human experience. They are neither entirely wrong nor entirely right.

What can you tell us about your most recent book, The Metaphysics of Media? Other than the invisible environments and effects that media ecologists strive to raise to awareness, what ‘exactly’ is metaphysical about media?

Again, media are really nothing more than extensions of us. It is we who are metaphysical. We believe and refuse to believe. We believe in things that have no physical nature, no material reality, and we refuse to believe in them. We believe in things that not only have a physical, material nature but are also empirically measurable, and we refuse to believe in them. And our media play a role in this.

The Metaphysics of Media is predicated on the observation that different media throughout human history have engendered and supported different conceptions of reality and, consequently, un-reality.

It is difficult, in a few short paragraphs, to go into the argument in great depth or to cite the evidence I provide in some detail in the book. But I’ll give a thumbnail sketch: the era of primary orality is marked by animism and monistic pantheism; a single realm of reality imbued with the supernatural. Magic, science and religion (to cite Malinowski’s title) intermingle as immanent reality and transcendent reality appear inseparable.

The development of writing systems – especially the alphabetic – and the onset of literacy create a rift between immanent and transcendent experiences. Oral tales of transcendent experience become sacred scripture and orthodoxies are formed, while at the very same time the fixity of speech in space lends objective distance to thought (as Walter Ong pointed out). The rift deepens and expands as literacy itself deepens and expands following the development of movable-type printing. Three “camps” appear, each one championing its preferred metaphysical orientation: those who believe only in an immanent reality (natural philosophers, scientists, etc.), those who believe in a dualistic reality with the transcendent trumping the immanent (theologians), and those who desperately try to bridge the gap between faith and reason (Thomists).

In the technologically-developed west, the breach between transcendence and immanence becomes irrelevant in the era of electricity as transcendence itself all but disappears. Propositional structures of thought fall to the presentational, reason gives way to emotion, fixed, objective point of view cedes to subjective personal experience, a “secondary orality” arises. We fine-tune our technologies to bring us only the information we want, and ignore much of the objective reality of those parts of the world (the vast majority, in fact) who do not enjoy the same level of development as we do.

At every point, I must emphasize, our technologies act as instruments of our will. There is no determinism involved. If Americans are largely ignorant of the world, it is not the fault of the media that saturate our lives. It is because we choose our ignorance, and use those media to facilitate it.

You are an active member of the Media Ecology Association. How did media ecology as a subfield within the larger discipline of communication studies evolve since you were a doctoral student at NYU?

I am at the moment an active member of the MEA and certainly wish to remain so. I was not always so active and can’t really predict what the future holds in store for me. The “vicissitudes of life” and all that… Others have been far more active than I have and are largely responsible for shaping this organization into what I believe is an incredibly important “clearinghouse” for scholarship and collaboration. Lance Strate, Thom Gencarelli, Janet Sternberg, Jim Morrison and so many others have been there since the beginning and we all owe them a great debt. They can describe in far greater detail the evolution of the MEA as well as the meta-discipline of media ecology.

For my part, I’m not sure that media ecology, as a meta-discipline, has evolved at all. There has always been a fluidity, a flexibility, in both perspectives and methodologies, that allows for a very high degree of creative and critical thought. McLuhan’s probes are both philosophy and literature, Ellul blends sociology and theology, Innis looks at communication as an economic activity and technologies as media of exchange, Maryanne Wolf (although I’m not sure anyone has yet told her she is a media ecologist) gave us our first neurological study of literacy that is, at the same time, a deeply philosophical work. Elizabeth Eisenstein’s historical approach influenced the work I did on my first book, Why the Irish Speak English, and even The Metaphysics of Media, while it attempts (and I emphasize the word attempts) the sort of cultural criticism so masterfully achieved by Jacques Ellul and Neil Postman, is very historical in its approach.

So I see no real evolution in media ecology beyond the “shape shifting” nature that seems to have been deliberately embedded in its fabric. The one thing that must, I think, always define a study we recognize as media ecological is its acknowledgement of the interactions of cultures – and the people who constitute those cultures – and their technologies.

Do you think that media ecology will ever attain the status of a discipline, concerned as it is with invisible media effects and environments – again, a sort of nothingness?

I sincerely hope not. It was only after oral tales became written orthodoxies that some people were labelled “pagans” and “heretics” and burned at the stake for unorthodox views. The greatest strength media ecology possesses is its ability to generate unorthodox views. Media ecology makes a better “Trojan horse” than a golden bull.

What are you currently working on?

I am currently testing the limits of my inadequacy as editor of EME: Explorations in Media Ecology, the journal of the Media Ecology Association. I am working hard on it, but have the unenviable fate of following in the wake of the last editor, Corey Anton, a frighteningly intelligent guy and a fierce workaholic who did a fabulous job with EME over the last few years.

If I manage to survive this experience, I am very close to having prepared for publication what will be either a very lengthy biographical essay or a very brief book about a late-19th/early-20th century Dublin barrister and amateur bibliographer, Ernest Reginald McClintock Dix. Dix represents, I believe, one of the last of the archetypal “men of letters” that McLuhan insisted were a natural by-product of alphabetic literacy.

I also claim to be working on a book, very much indebted to the thought of Christine Nystrom, about the possibility of human extinction as a result of our short-sighted and self-centered technological choices. I started it nearly two years ago and have made very little progress on it in the last year. No, that’s not true. I have made absolutely no progress on it in the last year. But I continue to claim that I am working on it. Perhaps you might ask me again this time next year?

© Excerpts and links may be used, provided that full and clear credit is given to Peter Fallon
and Figure/Ground with appropriate and specific direction to the original content.


Suggested citation:

Ralón, L. (2010). “Interview with Peter K. Fallon,” Figure/Ground. December 30th.
<http://figureground.org/interview-with-peter-k-fallon/>


Questions? Contact Laureano Ralón at ralonlaureano@gmail.com
