Glenn Kenny’s reprinted an old interview he did with Scorsese when home video was a new thing, and the ensuing comments made me realize that sometimes I still get flashes where home video feels like a miracle. When I was a kid I understood that TV would show Abbott & Costello Meet Frankenstein about once a year, but that was a model of reliability compared to the days when missing, say, The Conformist during its theatrical run might mean having to wait 25 years to get another crack at it. One reason I appreciate old-timey critics like Sarris, beyond anything they ever wrote, is the dedication it would’ve taken to hunt down the most obscure Allan Dwan movie and then create whatever mnemonic devices they had to in order to remember its details because—very probably—they were never going to see it again. That rarity lent a lot of magic to scanning repertory house calendars when they came out because you never knew when some movie you’d been hammering your friends with for years would be on it.
I have to say, though, I’m having a time getting my head around streaming. I like collecting stuff and I also like just staring at shelves of things, be they movies or books or what-have-you. Yeah, I probably don’t need that copy of Wellman’s The Call of the Wild in my closet, but I like knowing it’s there. And when I think of the scores of movies I do feel like I need to own physical copies of, there’s so many of them I may as well go the semi-whole hog, even if physical discs look like a losing technology.
On January 20, 1952, Vittorio De Sica released his masterpiece Umberto D., and God saw that it was good. It still is. Something more than just another “great film”, it’s one of the loftiest peaks in Italian neorealism, the postwar film movement that tried to draw the shortest possible line between movies and everyday life. To this day, watching Umberto D. remains a full-body experience: what’s at stake for its unlikely protagonist is communicated in such clear and concrete terms that we come to register the minutest adjustments in his emotional coloring. But it’s the film’s conclusion, which manages to be both definitively devastating and hypnotically sphinx-like, that concerns me here.
For those who’ve never seen the film, it’s about the retired civil servant Umberto Domenico Ferrari, who lives in a rented room in Rome not long after World War II—a terrible time and place to be alone in the world. Umberto’s station in life has been drifting downwards for some time, we are made to understand. When we meet him, his only friend is a naïve young housemaid who’s saddled with her own problems; he lags so far behind on his rent that his landlady allows hookers to turn tricks in his bed; and the one thing standing between him and a self-administered mercy killing is his constant companion, a personable but utterly dependent terrier named Flike, who’d be doomed without his master.
The film takes place over the handful of days in which Umberto loses his last toehold on life, and his final descent from have-not to have-nothing takes us into situations that we normally see only in nightmares. De Sica spells out in exacting detail just how much work it takes to be poor: whether he’s trying to sell an old watch that nobody wants to buy or scamming a bowlful of food from a soup kitchen for Flike, Umberto is constantly fighting just to reach the next moment in his existence. When at last he’s reduced to pauper status, he has to force himself to extend his hand for alms, making Umberto D. perhaps the only movie to notice how unnatural the act of begging is.
For the most part De Sica and the great Marxist screenwriter Cesare Zavattini successfully avoid sentimentalizing their baggage-laden hero. For starters, they present Umberto as something of an asshole: it’s implied that he’s partly responsible for his predicament, and he’s much better at asking for favors than he is at performing them. (Carlo Battisti, the linguistics professor whom De Sica chose for the part, projects the dour and hissy personality of an unlovable grandparent.) Likewise, De Sica doesn’t overplay the Flike card, mainly by refusing to acknowledge the canine point of view. In the one instance that he slips up—Flike flinches as a human would at the sight of another mutt being abused—we get a glimpse of the different, more ordinary movie that Umberto D. might have been.
Umberto D. observes the daily life of its characters with the intensity of a jeweler’s loupe. A famous scene, played out in something close to real time, merely watches the housemaid go through her morning routine; in one fragrant shot, still in bed and only half-awake, she watches a cat picking its way across the skylight above her head, in one of those mysterious, ineffably right moments of cinema. De Sica pulls so many of these details together that by the end we seem to be inside Umberto’s world; the critic André Bazin put it best when he said that Umberto D. “makes us aware of what it is to be a man. (And also, for that matter, of what it is to be a dog.)”
Near the end of the film Umberto, now out of options, leaves his house for the last time, intent on finding a home for Flike—in effect, clearing the decks for his own suicide—but the world thwarts both this humble effort and, even more appallingly, his subsequent attempt to kill himself. When the terrified dog bolts from his arms, Umberto must win back his trust, coaxing him into a game with a pinecone before the two of them scamper away down a tree-lined path in the park.
Now, anyone who’s sat through the film has to concede that man and dog will soon be dead—in a week perhaps, or perhaps in an hour—and yet if you didn’t know better, you’d think that master and pet don’t have a care in the world as they frolic along that pathway. Even the impact of De Sica’s Bicycle Thieves, one of the most celebrated downers in the history of art, is cushioned by our knowledge that at the movie’s end Antonio Ricci still has a home, a wife, and his son’s undying love. Umberto, though, is left facing the abyss, and yet in that final shot he displays a vitality, even a joy, that’s visible in no other part of the movie. How can this be? Acceptance is a virtue, God knows, but when you’re on the bricks like Umberto is, acceptance and two bits won’t even buy you bubble-gum. Umberto and Flike disappear from view, some rowdy schoolboys sail past the camera, and suddenly it’s time to get on with our lives again. But what exactly did we just see? I’ve discussed the subject with friends and read what wise men have to say on the matter, but the most satisfactory answer came from an entirely different movie.
Jack Arnold’s The Incredible Shrinking Man appeared in 1957, and to this day it remains my favorite ’50s sci-fi flick, largely because of its graceful, enlightened ending. Richard Matheson’s story stars another abrasive hero, Scott Carey, who is enveloped in a radioactive mist that causes his body to begin shrinking—first to laughably childish, and finally atomic, proportions. A lot of Shrinking Man’s entertainment value springs from its parade of Brobdingnagian props: straight-pins that double as spears, a mousetrap the size of a minivan, kitchen matches that look like saguaro cactuses. (One of the movie’s biggest jolts comes in a shot of Carey unexpectedly sitting in a chair that fits him, topped by our realization that he’s moved into a dollhouse.)
The earliest threats to Carey come from such common domestic sources—a sour-faced tabby cat, a burst water heater—that it’s like being terrorized by a ficus tree. The scenes in which he’s reduced to the size of a Ken doll and juxtaposed against his strapping, buxom wife discreetly pick at the male dread of impotence, but, like Umberto D., the movie is ultimately about what happens to a person when lonesomeness becomes a way of life, and Carey’s description of his existence as “a gray friendless area of space and time” could serve as a tagline for De Sica’s movie. It’s only after he’s vanquished a towering tarantula (it straddles the camera in repulsive close-up) that Carey recognizes a deeper enemy, and realizes that he’s already licked it.
“The infinitesimal and the infinite…this vast majesty of creation…to God there is no zero….” As Jack Benny would put it: Well. But while that language may be a little bit gamey, the typical 1950s sci-fi flick was so intent on easing the age’s anxieties that it felt it had done its job once it had dropped an A-bomb on its mole men or leech women and blown them to kingdom come. For Richard Matheson to actually think through the implications of his original idea was an act of artistic largesse, and the image of Carey stepping off into the cosmos with this micro/macro Möbius strip swirling around inside his head only made it that much more generous. Under a title bad enough to make us wither from sight, Matheson would later concoct a sequel to The Incredible Shrinking Man. In it Scott Carey’s wife begins to shrink, too, and joins him in his tiny adventures until, thanks to exactly the type of miracle which the first movie so forcefully rejected, both Careys return to their normal height and retake their place in the world. It’s as if Matheson had set out to prove that the best endings are the ones which open themselves outward to the largest possibilities.
The Incredible Shrinking Man came into the world five years after Umberto D., and in the decades since, the two movies’ fortunes have done a do-si-do. Shrinking Man, an instant hit in ’57, was still playing in crowded theaters when I saw it three or four years later, whereas De Sica’s movie, coming at the tail-end of the neorealist cycle, was a notorious flop in Italy. The Minister of Culture, with one eye glued firmly to the wrong end of the telescope, accused it of national slander while the Italian Communist party rejected its pessimism. Today, of course, Umberto D. is one of cinema’s most hallowed titles while Shrinking Man barely rates as a cult movie.
Say what you will about The Incredible Shrinking Man, it helped me to appreciate Umberto D. on a level beyond trite miracles or easy despair. Once he’s regained Flike’s trust with that pinecone, Umberto has done everything that he needs to do in this world. His bags are packed. And like Scott Carey, he recedes into “the infinitesimal”, an invisible world in which the ties that bind man and beast can never be erased—a place where dogs and men bear the same-sized souls, and there are no zeroes.
The problem is actually state-of-the-art Hollywood filmmaking itself, which while in pursuit of relentless video-game-style cool and nonstop action no longer has room or time for ideas or story or character or even other kinds of tasteless sensationalistic impact—the kind that Samuel Fuller, Stanley Kubrick, Verhoeven and Lars Von Trier, for example, have trafficked in without always resorting to chases and punching, chases and punching, and then some shooting.
That’s from Michael Atkinson’s takedown of the Total Recall remake, which I was ready to sign onto without even reading it because of the whole Jesus!-Hollywood-get-some-imagination-already thing, but also because I’m a fair to middling fan of the old Schwarzenegger number. Indeed, I’ve bitched so much—here, there and everywhere—about the lack of “ideas or story or character” in mainstream fare that I don’t really need someone haranguing me on the subject.
The slam-bang relentlessness Atkinson is describing refers to the sensation-centered cinema of Roland Emmerich’s mega-disaster flicks, Ridley Scott’s Gladiator, Nolan’s Batman trilogy, and Paul Greengrass’ Bourne movies—the exact kind of cinema that Pauline Kael once feared would eventually cause audiences to see nothing but “a big hole in the screen” during movies that don’t come with over-the-top action scenes. These movies take as their basic building block the loud, splashy and improbable sequence, as opposed to the old-fashioned story that organically grows out of a single idea—roughly the difference between a string of sausages and a living pig. Fans of the style like it because it’s exciting, and at its most extreme it provides what they think is a one-to-one correlation between the perils on the screen and their own experience—“It’s like you’re really at war,” they’ll tell you—although why anyone would want to experience such a thing is never explained, any more than the difference between sitting in a comfy theater chair and someone firing a machine-gun into your face is ever reconciled.
The style was recently christened with a name—“chaos cinema”—which successfully conveys the idea of a perspective that’s missing a unifying consciousness, and when Tony Scott, a past master of fragmented editing, committed suicide last week, his work was hailed as “a smearing of the senses”, which gets at the same thing. Well, as for me, I don’t get—not even remotely—where the pleasure is to be had in this stuff. The final battle of Seven Samurai also employs a lot of cutting, but only after Kurosawa has so thoroughly grounded us in both the characters of the combatants and the layout of the battleground that not only can we make instant sense of what we’re seeing, we can derive meaning from the action even as it’s happening—meaning that goes far beyond “Oh, he got him right in the head!” I believe the people oohing and aahing over Bourne’s car chases are being sincere when they say they’re having a good time; I just don’t think they’re demanding enough. If the biggest high you get from movies comes from a fireball seen from half a dozen angles, then a stripper humping a silver pole must make you feel like you’ve just gotten laid.
For my money, good action scenes—whether it’s the train robbery in White Heat, the encircling nightmare of Nada’s arrest in Carlos, or a shootout on the velvety streets of a night-darkened town in No Country for Old Men—do a hell of a lot more than throw me into a passive trance. Craving disorientation isn’t just infantile—it’s self-defeating. The touches and details that go into a successful action scene create levels of involvement and satisfaction that go far beyond who’s whacking who. The car chases in Siegel’s The Lineup and Halicki’s Gone in 60 Seconds are thrilling in part because, even amidst the mayhem, we can appreciate their geographical correctness as they zoom across San Francisco and Los Angeles.
The ironic thing is that “chaos cinema” ultimately hails from Sam Peckinpah’s The Wild Bunch, which in 1969 contained an unprecedented amount of multi-angle editing, to the point of setting a record for shot-to-shot edits in a single feature film. (Some 3,200, if memory serves.) But Peckinpah was a classical filmmaker to the bone, and every shot of his massive gunfights was both intensely motivated and carefully fixed within the physical arenas of his action, while his famous intercutting of film shot at different speeds was done with Hitchcockian precision to achieve very particular effects.
And so I’m fundamentally sympathetic to Atkinson’s complaint here. However, he makes a mistake that’s common as dirt when critics lament The Death of Cinema, and it’s all based on some strange misunderstanding people have about videogames. It may be simple prejudice. Once, when I told a pair of friends that I was playing Grand Theft Auto IV, they literally gasped “No!” as if I’d told them that I like strangling kittens in my spare time. It’s no dark or dirty secret, though: I own a PlayStation 3, and I’ve enjoyed the hell out of the half-dozen games I own. And I’m here to tell you, my brothers and sisters, those games—all among the most popular ones on the market—provide an experience which is completely and utterly at odds with the slash-and-stab attack on the senses that Atkinson is talking about. In fact, he has it exactly bass-ackwards. Movies haven’t come to resemble awful videogames; instead, the games—these games, anyway—have done their best to look like good movies.
The games’ cinematic roots can be seen dangling from them in various ways. In GTA IV a bank heist gone awry leads to a reproduction of the (classically staged) street shootout in Heat. Red Dead Redemption owes some of its story and many of its tonal elements—the music, most noticeably—to Unforgiven and Leone’s spaghetti westerns. And as its title indicates, L.A. Noire is the most movie-conscious of them all, with in-game references to a million old crime pictures and a wild foot-chase through the Babylon set from Intolerance.
There’s none of Tony Scott’s whiplash editing style in any of these games, not even for a second. Indeed, that would be impossible, for apart from the cut-scenes—that is, those autonomous little scenes in which the storyline is advanced without the player’s participation or guidance—there’s no real editing at all. Typically you’re viewing the scene through a proscenium-like frame, just as in a movie. Even during the gunfights the action remains framed, continuous and seen from a constant perspective—your own.
Atkinson also complains about the breathless pacing of the modern action movie:
Total Recall is structured in one-second bricks—that’s exactly as long as you get, and not one microinstant more, to let your eye rest on an image, contemplate a character’s feelings, or piece together a narrative sequence’s logic. What movies traditionally basked in now comes at us in strobe-rate splotches…You watch the blip-blip-blip of Total Recall’s trite ingredients speeding by, and your abandoned craving for context and contemplation and substance—any substance—quickly turns into irritation and then disgusted rage.
In GTA IV and Red Dead Redemption, the player’s surrogate is normally found on foot, and has to be put on horseback or in a car to travel with any velocity at all. If you want him to walk across the entire “sandbox”—meaning the whole territory on which the game is played—from one end to the other, you’d better be ready for a lot of context and contemplation, because it’s going to take you hours. These game-worlds are each infused with an uncountable number of details that serve as constant enticements to slow down and examine one’s surroundings. Red Dead Redemption, for instance, recreates the topography of the American west—from the plains to the deserts to the snow-capped mountains—while carefully including all the wildlife, weather conditions, and changes in light one would expect to find there.
This encouragement to explore to your heart’s content is the opposite of “chaos cinema”, which holds your attention in a death-grip and never stops directing your gaze. The appropriate cinematic equivalent for videogames, in which “the camera” is perched slightly above and behind the character, isn’t Gladiator at all. It’s the Dardenne brothers’ subjective camera peeking over Rosetta’s shoulder.
In truth, the impersonal, purposeless cutting that’s killing so many action movies today is derived from an art-form that was the whipping boy for everything that was shallow and fast in the ’70s and ’80s: the music video. If you want to blame someone, blame Adrian Lyne and his goddamned Flashdance video. (It’s only fitting: chaos cinema has shredded the musical, too.) That’s the model that has six edits whenever someone tosses a cigarette away, that zooms in and out willy-nilly, and that rejects anything resembling a governing consciousness. The distinction is hardly a milestone in the history of aesthetics, but it’s worth getting right if it’s worth going into at all. Far from corrupting movies, videogames have done their best to replicate the older medium. They’re practically a tribute to it.
The very title of Movie Wars: How Hollywood and the Media Conspire to Limit What Movies We Can See tips us to the best and worst qualities of Jonathan Rosenbaum’s writing. Rosenbaum writes about serious subjects, and for better or for worse he writes about them seriously. When you read Orwell’s or Agee’s journalism, you can feel how, in every wording and punctuation choice, they made an ongoing effort to write within what they considered to be the limits of their abilities. But writing over his head is something that Rosenbaum never worries about because his strengths rest almost entirely in his content; you don’t go to him for the snark, and nobody comes away from his prose thinking “Man, this shit really sings.” Writing is a job of work to him, and he’s uninterested in duplicating Kael’s ability to replicate in words an actor’s physical gesture, or Hoberman’s dashing historical deductions, or Farber’s ability to stand the language on its head; he’s got all the language, plain though it might be, that he needs to express his ideas. This refusal to spruce himself up stylistically mirrors his Amish refusal to groom his physical image, and perhaps make himself, if not more telegenic, then at least a little less weird, a little more presentable to the masses.
There just aren’t a whole lot of zingers in your typical Jonathan Rosenbaum essay, and this, as much as his refusal to shill for the studios, has put a cap on his fan-base over the years. And since nobody likes being told that they’ve been bamboozled, when Rosenbaum commits all of these crimes and then throws a black-chopper buzzword like “conspire” into the title of his book, it’s like he’s begging to be jeered at. [See the comments for this post for an important clarification of this point.]
I believe Rosenbaum is aware of all this, and that early on he consciously decided to take the plunge and put down what he thinks, in exactly the language it comes to him in, and let the chips fall where they may. I understand the decision (if indeed he made it), for leavening his style to increase his readership would put him on the same path as the eminently readable, consummately empty Anthony Lane. But film criticism that wants to improve the climate for making good movies is suffering from the same pinch that our political progressives have been suffering from for years: the need for an American idiom which doesn’t sound like cant, and whose logic will appeal to “regular people” (however you define them) and make them want better movies, too. So built-in is our resistance to earnestness that even the baldest description of the situation gets someone like Rosenbaum labeled a Chicken Little, often by folks who in a slightly different setting might happily agree with his analysis.
Personally, I’m happy to have a Jonathan Rosenbaum around and saying the things he does; even when he says them in a counterproductive way, it’s better than having nobody say them at all. Since the videotapes of Robert Redford, Harvey Weinstein and Arthur Sulzberger, Jr. sharing a steam-bath and agreeing to suppress foreign and experimental films are apparently never going to see the light of day, much of the evidence for Rosenbaum’s case is necessarily anecdotal. But there are an awful lot of anecdotes to recount, to the point that the bottom-line truth of his argument seems irrefutable. What makes less sense to me are the people, many of them committed cinephiles, who respond to these arguments by freaking the fuck out about them with a faux sophistication masking itself as either cynicism (“It’s only a movie!”) or condescension (“Of course, the studios are jacking us around! Grow up!”). These denials sound just like the push-back of a Fox News anchor when a liberal gets too mouthy on his program—their only intention is to shut the conversation down, ASAP. Since many of the people who don’t want to engage with Rosenbaum’s argument (or its corollaries) are themselves liberals who in the past have been poked by conservatives with the same rhetorical stick, it’s—well, entertaining, I guess is one word for it—when they act like Bill O’Reilly and fight for a status quo which they know to be pathetic.
I’ve got some ideas about why people respond to criticism of mainstream entertainment with such vehemence, but that’ll have to wait for my upcoming post entitled I Want My MAYPO! Last night I came across the chapter in Movie Wars dealing with the American Film Institute and its initial list (from 1998) of the “100 Best American Movies”, and it reminded me of just how many people, far from thinking about Rosenbaum’s position and pondering the evidence in favor of it, instinctively side with the suits and their lame-ass product the same way struggling members of the middle class inexplicably identify with Wall Street investment bankers. It also reminded me of this thread from Salon’s Table Talk message board, just after the AFI list was announced. It preceded Rosenbaum’s book by a couple of years at least, but covers most of his major objections to the exercise, and it’s still a good, lively read, notable for two things in particular. One is that rarest of rarities, the Internet poster who, having vociferously aligned himself with one position, does a little research, decides he was wrong, and then not only reverses himself, but does it so all can see. Remarkable.
But I was even more fascinated by the contributions of the NYU journalism professor Jay Rosen. The fact that he was a newcomer to Table Talk’s film discussions didn’t keep Rosen from trying to turn the thread into one of his seminars, with Herr Doktor tossing out “stimulating” thinking points for his captive students to mull over. As you can see, his presumption was actually exceeded by the bubbleheaded defenses he mounted on behalf of the AFI’s little PR gimmick. With his rhetorical questions handed down from Parnassus and shameless goalpost shuffling, he was like the anti-Jonathan Rosenbaum, and to this day I wonder what the miserable wienie hoped to accomplish. If he wasn’t a ringer for the AFI, he sure did a great impression of one; for the record, his game hits rock-bottom here.
A critical trait I can see the theoretical value of but find basically impossible to adopt: ignoring some major negative within a movie in favor of focusing on what the movie is doing, or trying to do. On his blog Rosenbaum just reprinted his essay about the director Cy Endfield (probably best known for Zulu), and he bases a lot of his case on Endfield’s Try and Get Me!, a low-rent 1950 noir about a family man who befriends a stranger, yadda yadda, it doesn’t really matter here. It’s a mostly negligible film until its last half hour, when a new movie comes bursting out of the old movie’s chest, and we’re treated to a ferocious lynch-mob scene that actually stands up well to the riot in Lang’s Fury. Rosenbaum’s right when he says that the movie’s directed with some brains and care, but he fails to mention one unavoidable problem: its star, Frank Lovejoy, gives a performance that’s neither lovely nor joyous. In fact, it’s textbook awful. The charisma-free Lovejoy suffocates every scene he’s in, dragging its rhythm down, or backwards, with his pained attempts to look desperate, and it’s especially ruinous since his character is a man who’s supposedly being flushed away by momentum and circumstance. (Lovejoy wasn’t bad as Bogart’s cop buddy in In a Lonely Place, but that was a small role, and in 1950 you pretty much had to defy God himself to look bad in a Nick Ray movie.) And yet Rosenbaum declares Try and Get Me! to be “a masterpiece of the early 50s”.
Well…hm. That’s a pretty tough sell for those of us who think the word “masterpiece” ought to be reserved for a few ambitious and accomplished works whose flaws, if any, amount to nothing more than hairline fractures. It’s also a tough sell to those of us who see movies as an ultimately collaborative effort, who can’t forget that watching them is something we have to experience physically, who feel that a director’s job includes coaxing good work out of his performers as well as his crew, and who—if nothing else—notice how much better movies work when actors actually enhance the material rather than stand around stinking the place up. I suppose I could appreciate Transformers 2 as a Platonic form that’s just hovering in the ether, rather than the bag of shit it really is, if I closed my eyes and let myself float up to some majestically neutral plane, high above all those critics who rely on their precious little value judgments when they think about movies. But I just can’t help thinking that it’s important to stay in touch with the film that actually appears on the screen, and that just as much can be learned from what’s wrong in a work as from what’s right in it.
But the arguments for an impersonal appreciation of art take a lot of forms. More times than I care to count I’ve gotten into fights over how relevant an artist’s intentions are (or even how interesting they are)—a surprisingly contentious point for some folks, who seem to think that even to consider what Renoir had to say about Grand Illusion is at best uncritical star-fucking and at worst a mental surrender to some kind of fascistic paternalism. Well, part of the reason I seek out a Suttree or Crime and Punishment to begin with is because I like spending time with the weird febrile goofy fucks that create them; hooking into a work gives us what’s almost certainly going to be the only contact we ever have with an Agnes Varda or J.G. Ballard.
This shit made me think of Pauline Kael’s line “In the arts, the critic is the only independent source of information. The rest is advertising.” When Thomas Doherty published his article “The Death of Film Criticism” a few months back, people came spilling out of the movie blogosphere’s woodwork to point out how deeply dopey it was in light of all the talented film crit available in different media today. (Now, professional movie reviewing really is suffering, but that’s a distinction Doherty never makes.) The ensuing pile-on was to be expected, partly because bloggers like to flail away at things, but mainly because Doherty’s argument was insane—not only is film criticism not doing badly, it’s healthier than it’s ever been. It didn’t help either that Doherty made his case out of both corners of his mouth: read it closely and you’ll see he never says that film criticism is done for—the jury’s still out, I guess—but that other people say so, and here’s a little (though not much) evidence to back them up, with a closing line to the effect of “It’d be a shame if it’s true, wahh.” Obviously he wrote it that way to cover himself from the shitstorm that he knew must arrive, which only makes me wonder why someone who can anticipate both the content and the venom of the rebuttals headed his way would plod ahead with his thesis anyway. Of course he’s getting paid, but presumably he would’ve gotten paid just as well for writing something that isn’t demonstrably false and wouldn’t generate personal attacks against himself and his descendants lo unto the seventh generation, all of which leads me to the conclusion that Thomas Doherty is both a fool and a masochist.
Anyway, Glenn Kenny (who, incidentally, thinks Shutter Island rates being mentioned in the same breath with Vertigo, and, I’m sorry, that too is simply incorrect) and a host of other bloggers, instead of just dismissing Doherty’s article with a “Hmph! What a conspicuously fallible argument the man makes!”, chose to go full-monty ape-shit about it. More to the point, none of them turned the discourse towards what still seems a very logical progression of such a conversation, to wit, how all the movie love that’s out there can be translated into getting better movies made. One way, I’d suggest, would be not to supply free merchandising ad copy for the studios in the New York Times, and another would be to not gas up empty garbage like Shutter Island. In fact, Scorsese is sort of ground-zero for what I’m talking about. In My Voyage to Italy and his DVD commentaries, the man talks the talk like no man alive: he understands exactly how movies work and what makes them special, and his taste is nearly impeccable. And yet if you shove a camera into his hands, he comes back with a 130-minute episode of The Twilight Zone.
I did find one good thing thanks to Shutter Island, though:
Forty years ago thinking people had to pick sides with Kael or Sarris in the great auteur debate; today we get to pick between Richard Schickel and Harry Knowles, a pair of vainglorious empire-builders who manage to be repulsive in totally different ways. The Schickel quotes in that article are pretty staggering for someone who does nothing to discourage the popular perception that he’s a film authority, and even in the one instance where he’s right, he’s right for the wrong reasons: Harry Knowles is, indeed, “a gross human being,” but it has fuck-all to do with his appearance. If you’re ever looking for a vivid contrast in critical styles, just listen to the DVD commentaries for the three spaghetti westerns that Clint Eastwood made with Sergio Leone. The terse, insightful commentaries by Leone scholar Christopher Frayling on the first two movies are those of someone who loves certain films and has spent some time thinking about them, while Schickel on The Good, the Bad & the Ugly—certainly the plum assignment of the trilogy—is content to provide a lazy recap of what your eyes are already relaying to your brain, which is about what you’d expect from someone who mistakes Clint Eastwood for a great filmmaker.
I have to admit being tickled to hear that Paul Schrader also had to realize that he was living through a “historical aberration.” I eventually reached the same conclusion but with the handicaps of being almost a decade younger than Schrader, knowing next to nothing about Hollywood history, and never spending so much as a day of my life in show business. I turned 18 in 1972, and, as Ray Liotta put it in GoodFellas, “It was a glorious time.” The years 1970 to 1974 saw a bumper crop of films that were “great” in the canonical sense, but which worked even better as cinematic off-road vehicles: when the lights went down you couldn’t be sure where you’d come out. (Straw Dogs and A Clockwork Orange were released as Christmas movies, for crying out loud.)
The thing is, I didn’t realize it was glorious. When you’re 18 and just coming out of the Sixties, it’s easy to misconstrue situations like that as the natural order of things, and it’s only as the space once taken up by movies like Mean Streets and Badlands is consumed by Kramer vs. Kramer and Rocky sequels that you notice something funny, but not ha-ha funny, is going on. People like to blame Star Wars for dropping the curtain on the renaissance, but the truth is things weren’t ever going to stay that way, with Robert Altman literally getting his dreams greenlighted, and Coppola and Cimino breaking entire studios (or themselves) on the wheel of their follies.
What didn’t have to happen, though, were all the ugly things that did happen…