It’s one of the most iconic endings in American cinema: battered, punch-drunk Rocky Balboa (Sylvester Stallone), having “gone the distance” against heavyweight champ Apollo Creed (Carl Weathers), calls out to his mousy girlfriend, Adrian (Talia Shire). Adrian pushes through the crowd, the two embrace, and they profess their love to one another to riotous applause. A sweet, tearful finale to Rocky’s story of against-all-odds underdogism. But it wasn’t meant to be that way.
As Stallone told the Los Angeles Times in 2016, Rocky was originally meant to crowd-surf across the audience to Adrian. But, hamstrung by a modest budget and a lack of extras, the sequence wasn’t convincing. So Stallone and director John G. Avildsen reshot it, making it more personal and strengthening the romance between Rocky and Adrian as the film’s real emotional center.
The “Rocky” ending makes a strong case for the validity of reshoots — budget constraints drew the story’s love interests closer together, in a way that ultimately benefited the film. Many certified Hollywood classics were also subjected to extensive reshoots: George Cukor and King Vidor filmed a few scenes for “The Wizard of Oz,” swapped in like widgets in the new studio machinery that was just creaking to life. “Gone with the Wind” received much the same treatment.
Today, movies face the opposite problem. With franchises increasingly defined by a uniform, stylized sameness and budgets stretching toward infinity, reshoots are becoming more and more common. As in those early days of the studio machine, a revolving door of directors is becoming the norm.
Just this past week, Variety reported that Warner Bros. is investing $25 million to reshoot large sequences of its forthcoming superhero franchise team-up “Justice League” after Joss Whedon flew in to replace departing director Zack Snyder. It’s a situation increasingly typical of big-budget blockbuster filmmaking. Both of Disney’s “Star Wars Story” pictures (last year’s “Rogue One” and the forthcoming Han Solo spin-off/prequel whatsit) ditched their slightly more esoteric, indie-certified directors. Studios now regularly budget big money for additional photography, operating from the presumption that reshoots will be necessary.
With such practices becoming the industry standard, a nagging problem presents itself: Who makes the movies?
There are few ideas in pop culture as abused as the auteur theory. Developed in a 1954 essay by French critic/filmmaker François Truffaut as “la politique des auteurs,” auteur theory held that the director was the guiding artistic intelligence of a given film. As refined by the American critic Andrew Sarris in the early 1960s, the auteur theory valued, firstly, a given filmmaker’s technical competence (“a great director has to at least be a good director”) and, further, a “distinguishable personality” of the director, which expressed an “interior meaning” that was “extrapolated from the tension between a director’s personality and his material” (emphasis added).
This tension feels key; cinematic-artistic personalities emerged because of the systematic constraints, and collaborative nature, of filmmaking. The signature of an auteur is more like a faint palimpsest visible underneath a film’s action, editing, mise en scène, etc., as opposed to a bold, conspicuous graffiti tag.
Over time, “auteur” has been used (incorrectly) to refer to anyone who passably qualifies as a film artist. David Lynch is an auteur. Lena Dunham is an auteur. The Coen brothers constitute a single authorial voice and, as such, are a kind of mutated, two-headed auteur.
While this newfangled definition patently undercuts the original iteration of auteurism — which canonized film artists whose work revealed consistent thematic preoccupations not because they were granted carte blanche artistic freedom, but precisely because they were hemmed in by the assembly-line processing of studio filmmaking — it’s sort of understandable. “Auteur” is a fun, fancy-sounding French word. By mere virtue of this fanciness and its French-ness, it confers a kind of artistic legitimacy — just as it might, say, if Wendy’s were to refer to its production line of burger flippers and Frosty-makers as its “chefs de cuisine.”
However, in the contemporary studio system, most any iteration of auteurism feels like bunk. Modestly interesting directors with sufficient indie-nerd cachet, like Colin Trevorrow (“Jurassic World”) or Joss Whedon (“The Avengers,” and now “Justice League”) or Edgar Wright (initially retained for Marvel’s “Ant-Man,” later replaced), are tapped for major studio films. But instead of whatever stylistic peccadilloes they possess peeking through, they are flattened by the steamrolling logic of the franchise. And if they refuse to be flattened, those directors are summarily canned. The authorial signature is smudged, if not entirely overwritten. (Perhaps we’ll one day be treated to the “Edgar Wright Version” of “Ant-Man,” comparable to the “Richard Donner Cut” of “Superman II” that surfaced long after the original director was swapped out for Richard Lester.)
Elsewhere, plenty of indie/studio-indie/foreign films are so clearly the product of original — and perhaps singular — creative visions that calling them auteurist in the traditional sense is almost beside the point. (Like, yes, Kathryn Bigelow or Tsai Ming-liang or Martin McDonagh or whoever make recognizable, even personal, films; but this is so obviously the case that calling them auteurs does little beyond vouchsafing them that Francophilic sheen of artistic worthiness.)
As “Rogue One,” “Justice League” and the Han Solo movie suggest, directorial talent is disposable, to the point of seeming wholly irrelevant. The function of the director in the Hollywood franchise film has been reduced to that of (to nip a favorite Sarris-ism) “the quasi-chimpanzee.” The less openly derogatory term, from Truffaut, would be metteurs en scène; such directors are scene-setters, not authors or artists.
With not only auteurs but even functional directors rendered expendable, the question remains: What constitutes the guiding intelligence behind a given film?
Even in its heyday from the 1950s to the 1970s, the auteur theory did not go unchallenged. Pauline Kael quarreled with Andrew Sarris across various magazines, arguing that the medium’s inherently collaborative nature precluded any form of singular authorship. A 2006 book by David Morris Kipen argued that screenwriters, and not directors or actors, were the surest guarantors of a film’s quality (he called this the “Schreiber theory,” exchanging a fancy Franco loanword for a sterner, more Teutonic one). But neither of these theories scans as a suitable replacement for the auteur’s primacy.
It would be too easy to claim that the Hollywood studios, and the bankers operating them, are the new lords of the franchise machinery (though the 2014 Sony Pictures leaks make a strong case). It’s also not entirely accurate. It’s more that franchise studio films — the big-budget blockbusters about dour superheroes and saintly CGI apes and whatnot — develop their own deviant logic. In their jigsaw interlocking of characters, scenarios and strings of sequels, such films establish a consistent (or just indistinguishable) aesthetic, itself determined by the success of other films in the franchise.
A big factor behind all the “Justice League” additional photography, for example, is tying the film more closely to the highly successful “Wonder Woman.” Audiences responded more positively to “Wonder Woman” than to DC’s other, more glum franchise entries. And, therefore, “Justice League” is being tweaked to seem more like “Wonder Woman.” Similarly, people responded to the grimness of “Dawn of the Planet of the Apes,” and so the new “War for the Planet of the Apes” is rendered bleak to the verge of self-parody.
The responses of audiences, the box office and (to a lesser extent) critics commingle to compose the new blockbusters, in lieu of any one director exerting a competing authorial force. The theory of collaborative authorship now extends beyond a given film set and into the cinemas themselves, drawing in fickle patterns of reception and appreciation.
It might be misconstrued as utopian: the whole filmgoing community coming together to set the tone of the current cinema. But in effect the result is, to use a phrase deployed by The New Yorker’s Richard Brody in a 2012 obituary for Sarris, a new kind of “populist demagogy.” Left unchallenged, popular prejudices reinforce themselves.
The end result: modern cinema isn’t so much collaboratively authored as totally unauthored, lacking signature or sensibility, executing itself automatically, like those machine-learning Twitter bots that quickly devolve into shameless trolling. It’s creativity by unwilling committee, in which accidental apparatchiks help set the boilerplate, exercising their hazy influence like old-school auteurs — or like an infinite number of quasi-chimpanzees toiling away in the dark.