MLA 2020 Panel: Bad Books

At this year’s Modern Language Association Convention in Seattle (January 9-12, 2020), I will be speaking on a round table discussing Bad Books. Information about the panel and a tentative abstract for the paper I will be presenting are included below.

 

338. Bad Books

Friday, January 10, 2020, 1:45-3:00 pm, 617 (WSCC)

Presiding: Eric Loy

Presentations:
1. “Notes on Notes on Notes: Glenn Ligon Reads James Baldwin,” Paul Benzon (Skidmore C)
2. “Books Behaving Badly: The Raison d’Être behind Perec’s La Disparition,” Priya Wadhera (Adelphi U)
3. “Debilitated Forms and Forms of Debility: On Writing a Failed Book,” Sharon Tran (U of Maryland Baltimore County)
4. “The Space of Megatexts: ‘Reading’ Mark Leach’s Marienbad My Love,” Bradley J. Fest (Hartwick C)

 

The Space of Megatexts: “Reading” Mark Leach’s Marienbad My Love

At over seventeen million words, printed in dense eight-point font across seventeen volumes, the second edition of Mark Leach’s Marienbad My Love (2008; 2nd ed., 2013) currently holds the record as the world’s longest novel and is what I have elsewhere called a megatext. Composed over the course of thirty years using a number of digital techniques, it is one of the more spatially imposing works of literature ever to sit on a shelf. Perhaps because of this, it also appears that no one has really bothered to read it. Whether due to prejudice against self-publication or critics’ perceptions of authorial vanity, the text’s sheer unreadable size has discouraged anyone from taking Leach’s work all that seriously. I believe this is a mistake, and this paper aims to seriously consider a remarkable project that rebelliously pushes against the conceptual, temporal, and physical boundaries of the codex novel. The revisions made for the second edition indicate not only that Leach intends for people to actually read his book, but also that Marienbad My Love is a complex theoretical statement about the novel in the digital age and a meditation on the present and future of literary writing. In this paper, I will argue that accounting for Marienbad My Love’s material size by finding ways to speculatively (and actually) read this unreadable text will encourage us to rethink how we theorize the novel in the twenty-first century.

 

For previous essays of mine on megatexts and unreadable texts, see:

“Toward a Theory of the Megatext: Speculative Criticism and Richard Grossman’s ‘Breeze Avenue Working Paper.’”

“Reading Now and Again: Hyperarchivalism and Democracy in Ranjan Ghosh and J. Hillis Miller’s Thinking Literature across Continents.”

“Writing Briefly about Really Big Things.”

“The Megatext and Neoliberalism.”

“The Time of Megatexts: Dark Accumulation and Mark Z. Danielewski’s The Familiar.”

Keyword Seminar on Length at the 2018 Society for Novel Studies Conference

I will be leading a keyword seminar on length at the 2018 Society for Novel Studies Conference, May 31-June 2 at Cornell University. I have included a description of the seminar and the names of the other presenters below.


Keyword Seminar on Length at the 2018 Society for Novel Studies Conference

Bradley J. Fest with Alex Creighton, Alley Edlebi, Andrew Ferguson, Jason Potts, Robert Ryan, and Aaron Vieth

Description

From multi-season serial television, to cinematic universes, to immense videogames, narratives across media appear to have gotten longer in the digital age. Can the same be said of the novel? On the one hand, authors have written lengthy novels throughout the form’s history. On the other, the issue of novelistic length seems newly pressing now that digital technologies have given writers the capacity to author books that are unreadably massive (e.g., Richard Grossman’s forthcoming three-million-page Breeze Avenue or Mark Leach’s seventeen-million-word Marienbad My Love). This seminar invites its participants to take up questions about length with regard to the role and status of the novel historically and at present. How does the history of print narrative influence how we think about novel length in the twenty-first century? Are there upper and lower limits to how long a novel can be (and why would such limits matter)? What is the relationship between the novel and other transmedia meganarratives? What is the legacy of the twentieth century’s “big, ambitious novel”? And, going forward, how do scholars study print and digital texts that are too big to read?

“The Megatext and Neoliberalism” and “Metaproceduralism: The Stanley Parable and the Legacies of Postmodern Metafiction”

I’ll be giving two talks in Pittsburgh over the next two months, on May 13 and June 22, 2016.

 

1. Friday, May 13, 2016 — 2:30 – 4:30. Part of a panel on “The Novel in or against Neoliberalism” at the 2016 Studies in the Novel Conference, The Novel in or against World Literature, Wyndham University Center – Oakland Room II.

Chair: Jen Fleissner, Indiana University

“The Megatext and Neoliberalism,” Bradley J. Fest, University of Pittsburgh

“The Novel in India and Neoliberalism,” H. Kalpana, Pondicherry University

“The Novel and Neoliberal Empathy,” Alissa G. Karl, The College at Brockport-SUNY

“Immanent Value in The Golden Bowl,” Paul Stasi, University at Albany-SUNY

 

The Megatext and Neoliberalism

With the steadily increasing storage capacity and processing power of contemporary information technology, enormously large texts are beginning to emerge that rival the books and libraries once imagined by Jorge Luis Borges. For instance, at some point in the near future, poet and novelist Richard Grossman will install Breeze Avenue—a five-thousand-volume, three-million-page “novel”—as a reading room in Los Angeles, and will also make this text available online in a fluid version that will change roughly every seven minutes for a century. Grossman’s text is, quite simply, too big to read; it is a megatext. This paper will consider the appearance of the unreadably massive novel as an emergent form native to the neoliberal era.

The writing, publication, and distribution of megatexts are impossible without the informatic, technological, and economic transformations of neoliberal globalization. For instance, the composition of Breeze Avenue would be inconceivable without big data and algorithmically generated text, without significant funding and personal wealth (Grossman was a high-level executive for a multinational financial firm in the 1970s), and without transforming the labor of the author from writing to managing. Mark Z. Danielewski’s twenty-seven-volume meganovel-in-progress, The Familiar (2015-    ), takes full advantage of contemporary digital composition and production to create a work deeply enmeshed in the digital present by self-reflexively remediating the new media forms made possible by the distributed networks and posthuman technologies of the twenty-first century—including electronic literature, premier serial television, social media, videogames, and YouTube. And Mark Leach’s seventeen-volume, ten-thousand-page, open source, digitally generated meganovel, Marienbad My Love (2008), takes advantage of crowd-sourced, collective authorship, reflecting the always-on unpaid digital microlabor that has come to characterize work in the overdeveloped world. Understanding such texts as unique outgrowths of and important critical reflections upon the age of neoliberalism allows us to explore pressing questions about the role of the novel in the twenty-first century and the possibilities for responding to the nonhuman logics of contemporaneity.

 

 

2. Wednesday, June 22, 2016, 1:30 – 3:00. I’ve organized a panel on “Videogame Adaptation” with Jedd Hakimi and Kevin M. Flanagan, colleagues in the Film Studies Program at the University of Pittsburgh, for the Keystone DH 2016 Conference, Hillman Library, University of Pittsburgh.

 

Videogame Adaptation


As videogames continue to emerge as a dominant twenty-first-century form, it is becoming clearer that they have complex relationships to other media. This panel, part of a larger collaborative project, will address issues of adaptation and videogames from a transmedia perspective, drawing particularly on the resources provided by film and literary studies.

 

Videogame Adaptation: Some Experiments in Method
Kevin M. Flanagan, University of Pittsburgh

This paper outlines the concerns and conceptual practices of videogame adaptation, noting the many ways in which videogames shape, or are shaped by, ideas, narratives, and mechanics from other media. In situating videogames within the discourses of textual transformation that animate current work in adaptation studies, I argue that traditional approaches to adaptation in English departments (which privilege novel-to-film adaptation in a one-to-one correspondence) have a lot to learn from games, which function as adaptations at all stages of their production and consumption. I also demonstrate how adaptation studies challenges claims to medium specificity that form a foundational conceit of videogame studies.

 

Metaproceduralism: The Stanley Parable and the Legacies of Postmodern Metafiction
Bradley J. Fest, University of Pittsburgh

Most critics of contemporary literature have reached a consensus that what was once called “postmodernism” is over and that its signature modes—metafiction and irony—are on the wane. This is not the case, however, with videogames. In recent years, a number of self-reflexive games have appeared, exemplified by Davey Wreden’s The Stanley Parable (2013), an ironic game about games. When self-awareness migrates from print to screen, however, something happens. If metafiction can be characterized by how it draws attention to language and representation, this paper argues that self-reflexivity in videogames is best understood in terms of action and procedure: as metaproceduralism.

 

Playing Los Angeles Itself: Experiencing the Digital Documentary Environment in LA Noire
Jedd Hakimi, University of Pittsburgh

Almost everything about the predominantly faithful depiction of 1947 Los Angeles in the recent police-procedural videogame LA Noire (2011) was based on archival material, including period maps, photography, and film footage. And while scholars have thought extensively about how film spectators experience mediated depictions of real-world cities, the videogame player’s parallel experience has been relatively unexplored. Accordingly, I take LA Noire’s simulacrum as an opportunity to reflect on what happens when a real-world environment is adapted into the setting for a videogame. Specifically, I position LA Noire in the tradition of the “city-symphony” film and a particular subset of film noir known as the “semi-documentary” to make the case that LA Noire contains crucial aspects of the documentary image. Consequently, LA Noire is not so much creating a fictional, diegetic world as presenting our own world back to us, in a manner that changes the way we experience the world in which we live.