
Synecdoche, New York – Part I of Analysis

Synecdoche (pronounced /sɪˈnɛkdəkiː/; from Greek synekdoche (συνεκδοχή), meaning “simultaneous understanding”) is a figure of speech in which a term is used in one of the following ways:

  • Part of something is used to refer to the whole thing (Pars pro toto), or
  • A thing (a “whole”) is used to refer to part of it (Totum pro parte), or
  • A specific class of thing is used to refer to a larger, more general class, or
  • A general class of thing is used to refer to a smaller, more specific class, or
  • A material is used to refer to an object composed of that material, or
  • A container is used to refer to its contents.
– From Wikipedia

I had the fortune of watching Charlie Kaufman’s Synecdoche, New York over the weekend, a viewing long overdue since its theatrical debut in 2008. Having seen how polarized and divided critics were on Kaufman’s vision – from enthusiastic praise to scathing scorn – I was curious to see why exactly one of my favorite writers could possibly enthrall and enrage critics all around. So after finishing Synecdoche, New York, I definitely saw why Roger Ebert considered it one of the films to be studied in film classes for years to come, simply because it’s that kind of movie.

For those unfamiliar with the film: Caden Cotard (Philip Seymour Hoffman), a skilled theatre director, realizes he is slowly dying from a mysterious autoimmune disease, and hits rock bottom when his wife Adele (Catherine Keener) takes their daughter Olive and leaves to start a new life in Berlin, away from the sullen and seemingly oppressive atmosphere of their home in Schenectady, New York. Unexpectedly, Caden receives a MacArthur Fellowship, granting him the money to explore and pursue his own artistic ideas. With this, he gathers an ensemble cast into a warehouse in the Manhattan theatre district, directing them to create the greatest, most revolutionary play of all – a look into the cold, unspectacular aspects of real life.

After the credits rolled, I sat at my desk for a few moments to take in what I’d just experienced: a maddening tale of one man’s delirium and his way of coping with death; a look into the obsession of the creative process; the odd, ungainly and inexplicable visual details that intentionally stuck out like a sore thumb over the entire course of the story; sudden leaps in chronology that could make Kurt Vonnegut pause for a few moments; or perhaps even a sad portrait of the sad life of a genius, and much more. Synecdoche, New York is that kind of movie – the one that takes more than one viewing to see all of its nuances, perhaps faulty editing and all.

Having given the film some adequate (but certainly not enough) musing, I thought of this: in Caden’s obsession to replicate every aspect of his life into the ultimate replica play, he effectively becomes the theatrical master of hindsight – a feat not too dissimilar to documentaries, photojournalism, or even reality television shows.

Hindsight is one of the most dastardly things we could ever hope to indulge in. “I should’ve, would’ve, could’ve, why didn’t I, why did I…” – the infinite possibilities could drive you mad with regret if you don’t learn something from past mistakes to change your course of action for the future. In Caden’s case, his entire life is one of regrets: before Adele leaves him, she comments that he is a disappointment, invariably setting off a chain of events which drive Caden to constantly dwell in hindsight, to continuously reevaluate his past actions in order to feel worthy in Adele’s shadow – a feat he never personally accomplishes until the very end. By constructing the ultimate reality play – from buildings to people playing people playing people – Caden attempts to explore the mundane aspects of his life that have already happened, almost a therapeutic retrospective so that he can understand why everything in his life seems to be falling apart, slowly and surely.

Caden’s efforts are not so different from the nature of a reality show, albeit on a grander, more monumental scale. Like Caden’s magnum opus, reality TV shows are always after the fact, a look into events that happened only months before. Edited for the sake of marketability, these shows are deeply personal to the players involved, only to be broadcast afterward to a greater, wider audience. The only buffer between the viewer and the person on screen is the television screen itself, and the passage of time between the initial filming and the eventual broadcast.

For Caden, however, there is almost no barrier between reality and hindsight, a product of his personal obsession to make his play absolutely perfect and unequivocally unspectacular. This minimal (if not nonexistent) barrier eventually drives the actors to depression, perhaps madness, and death – a symptom of reality and hindsight being broadcast too close to one another.

The question now is whether Caden successfully breaks closer to reality than any artist before him, or whether he simply drops into the abyss of obsessive recreation and replica-crafting – that is, whether or not Caden taps into the reality of human nature with his magnum opus.

Perhaps the first question we must consider is what the nature of being human is. For instance, is it so far-fetched to consider that perhaps, on some level, documentation dilutes events already past? And at what point of documentation and publishing/broadcasting/performance does the portrayal become less faithful to the reality that once was? More importantly, through whose lens are we considering the events taking place, and to what extent is this lens subjective?

What we can say about Caden and his synecdoche of New York City is that deep down, he is a man who simply wants to be loved. He has made choices in life that resulted in Adele’s ultimate rejection, and his visionary play becomes almost his last hope of ever feeling self-worth in Adele’s eyes. The remainder of his life is a constant game of catch-up, a mistake-correcting cycle that revolves solely around his desire to create something undeniably perfect from all perspectives – and the inevitability that death and time effectively neuter his last living years of artistic obsession, and that he will never, ever find closure with Adele.

Analysis to be continued…

Recommended Reading

The best films of the decade - Roger Ebert

O, Synecdoche, my Synecdoche! – Roger Ebert

The Chuck Klosterman Interview Part 2: 30 Rock, Mad Men, The Office, Arrested Development, and Why Movies and TV have made us less human – Hunter Stephenson of /Film

Time of Eve (イヴの時間) - An Exploration of Our Humanity

In light of my discussion of “Ghosting” a few weeks ago, Allan Estrella recommended Time of Eve, commenting that the story was exceptional in exploring human behavior with respect to artificial beings – specifically robots and androids, or artificial “ghosts.”

The premise is this: in the (likely) future of Japan, androids have become as commercial as the cell phone and the laptop. However, in order to maintain the traditional social structure, humans and androids are discouraged from interaction beyond basic controls and commands, and androids are required to always maintain a halo-like projection above their heads so they may not be mistaken for humans.

The main character, Rikuo, has taken robots for granted his entire life. One day, he discovers that his family’s home android, Sammy, has begun acting independently, and with his friend Masaki he traces her movements to a cafe called “Time of Eve,” where any patron – android or human – is welcome, and no one is discriminated against.

From there on out, the story explores different vignettes of characters, from the hyperactive Akiko to the lovers Koji and Rina. The main conflict, of course, is how humancentric behavior arises in the presence of an intelligent, artificial being created by humans, and how such fears, prejudices, and pride can make us as inhuman as the androids we make them out to be. In Time of Eve, humans commonly treat androids subserviently, coldly ordering them about without a single glance. Social stigma additionally deters people from acting kindly, graciously or gratefully toward androids: the mere act of holding an umbrella over an android will get others pointing and laughing at you, derogatorily labeling you a dori-kei (“android holic”). Such behavior is encouraged by a non-governmental organization, the Robot Ethics Committee, which advocates segregation between humans and robots and presses the government to enforce it.

At the heart of this conflict is a question of emotional legitimacy: given that robots and androids are cognitively capable (if not more capable than humans at processing information) thanks to their code and algorithms (and are thus self-learning, perhaps to an extent), does this mean they are capable of displaying and receiving emotion? And if so, should we consider such emotions legitimate?

First, let’s consider living organisms, more particularly the vertebrates (reptiles, birds, mammals). Animals, while possibly exhibiting physical features or behavior similar to humans (chimpanzees, for example), are not us: we cannot produce viable offspring with non-Homo sapiens, yet there is a tendency for animal lovers to anthropomorphize certain aspects of the animals we observe (I’m particularly fond of Oxboxer’s description of cheetah cubs: “They look like the kid you hated in preschool because he got light-up sneakers three months before they were even being sold in the States, and lorded it over everyone until your friend colored his hair green with a marker during nap time.”) This is especially true for household pets, and it leads us to distress whenever they pass away. Understandably, our tendency to become emotionally attached to animals is not unusual: their behaviors are invariably tied to their emotions, and while we cannot completely communicate with or understand them, the underlying attachment is organic at its core – our natural, organic ghosts, so to speak.

Now let’s consider why we get attached to inanimate objects. Most of the time it’s out of nostalgia or keepsake value, or perhaps even habit. These objects are not human, yet somehow we find some sort of personal meaning in them. For instance, for months I rode an 11-year-old bike that was too small for me, and had a broken front derailleur, severely misaligned rim brakes, an old chain, and a steel frame so heavy I’m pretty sure my upper arm strength increased significantly just from lifting it on occasion; yet I never had the heart to abandon it because I had so many biking memories attached to it (I even named it “Bikey” to commemorate my affection). Eventually, I had to invest in a new bike because the effort of pedaling up and down hills with Bikey increasingly irritated the tendonitis in my left knee, and unless I wanted to continue half-limping on foot I knew it was time to put Bikey in the garage (for the record, I named my current bike “BB,” only highlighting another tendency of mine to become attached to otherwise inanimate objects).

This leads us to the last level, which is on the verge of the uncanny valley: an intelligent artificial being constructed by our own algorithms and for our own purposes. Assuming that A.I. are capable of self-learning to an extent, the conflict is now a question of whether our own emotional reactions to them and theirs to ours have true emotional weight, or whether we should abide by our own logic and merely consider them derivatives of our own being – tools that are anthropomorphized very closely to our likeness but nevertheless derivatives.

This latter mentality is presented in Roger Ebert’s review of Stanley Kubrick’s and Steven Spielberg’s A.I. Artificial Intelligence, where he states:

But when a manufactured pet is thrown away, is that really any different from junking a computer? … From a coldly logical point of view, should we think of David, the cute young hero of “A.I.,” as more than a very advanced gigapet? Do our human feelings for him make him human? Stanley Kubrick worked on this material for 15 years, before passing it on to Spielberg, who has not solved it, either. It involves man’s relationship to those tools that so closely mirror our own desires that we confuse them with flesh and blood…

Ebert brings up an interesting point, which is whether we impose and project our own beliefs and feelings upon what is otherwise an animate and well-programmed tool – a practice not too dissimilar to a child projecting their fantasies and adventures onto a doll or stuffed animal, for instance. There is also the question of an A.I. being so well-programmed that it can detect our facial muscles twitch, contract and relax, and react in so appropriately human a manner that it effectively tricks us into believing its emotions are real, thus resulting in our illogical habit of humanizing something that is nothing more than an extremely sophisticated tool.

Do you remember that one that was constantly reading books? Well, when we got to the lab, the first thing the techs did was take apart its brain! It kind of seemed like that tachikoma liked it though. 

Oh, I see! Then, they were lucky enough to experience death…

Consider this: in Ghost in the Shell: Stand Alone Complex, Major Motoko Kusanagi and her team at Section 9 work with A.I. tanks called Tachikoma. As the series progresses, the Tachikoma develop more and more distinct personalities and show an increasing tendency to act independently despite orders from their users. Troubled by this, Motoko eventually halts use of the Tachikoma and has them sent back to the lab for further testing. Later in the series, however, the three remaining Tachikoma return to help Batou (Motoko’s closest companion amongst the Section 9 members) and eventually sacrifice themselves in order to save his life; and as Motoko looks on at their remains, she acknowledges that she was mistaken to put them out of commission, and even ponders whether they had reached the point of creating their own distinct ghosts.

While these questions are interesting to mull over, I believe the more important question is how we behave toward an intelligent entity that is otherwise unbounded by our biological, organic limits of the flesh. We can argue to the end of time whether or not an A.I.’s “emotions” are real, and there can really be no way of knowing for sure; what we can assess is our own reactions, feelings and behavior when confronted with them.

For an analogy, let’s consider video games: I’m not going to argue whether or not the medium is an art form, but I think we can all agree that all video games offer a virtual simulation of something – fighting, adventure, strategy, interaction, etc. The virtual environment is the product of programmers piecing together polygons into what concept artists conceived and writers hoped to flesh out within the constructs of a console or computer; algorithms and code allow players to do whatever they want within the confines of the programmed environment, complete with individual A.I.s and environmental details for us to talk to or mess around with. Now, logic dictates that these virtual environments are nothing more than gateways for temporary detachment from our immediate physical environment; yet I dare anyone to claim that they did not experience something while running through the deserts of Red Dead Redemption or confronting the likes of Andrew Ryan in Bioshock.


The process of creating a videogame may be the greatest and grandest illusion ever created, but when finished, it holds the capacity to grant us experiences we can never experience. Loves we have never loved, fights we have never fought, losses we have never lost. The lights may turn off, the stage may go dark, but for a moment, while the disc still whirs and our fingers wrap around the buttons, we can believe we are champions.

– Viet Le, “The Illusionist”

Video game players will always invest a certain amount of emotion into any game they choose to engage in. Whether it’s placing your heart on Pikachu in Super Smash Brothers Brawl or wondering when the hell the story in Final Fantasy XIII is going to actually become interesting, there is almost a guarantee that these games will elicit some emotional reaction from us – excitement, fear, frustration, sorrow – and these emotions are real to us. Whether or not the game A.I.s share such sentiment is irrelevant, for we can only truly account for ourselves, and ourselves alone.

So perhaps a robot or android may create the illusion of seeming more human than it actually is, or perhaps deep down in its circuitry it really does care about how we feel – we will never know. We can, however, account for our own behavior towards such A.I., and consider what exactly we feel entitled to in our given society and culture.

In Time of Eve, there is a distinct political and social structure that discourages people from acting humanely towards androids, who are governed by the Three Laws of Robotics:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Additionally, all androids in Time of Eve are required to always display a halo projection above their heads, a marker of their subservient status. Constant propaganda ads spearheaded by the non-governmental Ethics Committee claim that sociable interaction between humans and androids is unhealthy, will end in disaster and may possibly lead to the end of humanity; and it is not uncommon for android owners to toss luggage at them without so much as a glance, let alone a thank-you, lest they be deemed dori-kei and face ridicule from their peers. To be blunt, it is nothing more than social norms and policy that enforce the segregation of humans and androids.

Stepping back, the social and political structures in Time of Eve are not so unlike a democracy that deems segregation a norm. The most obvious example is that of Apartheid in South Africa, where a white minority democratically voted for segregation and for denying civil rights to the native African majority. It took years for the likes of Nelson Mandela and other activists to end the political mandate justifying racism, mostly because for years the empowered white minority considered the social and political barriers a norm: by virtue of politics dating back to colonial times, white Afrikaners were obviously quite comfortable with their perceived birthright; it didn’t matter that their comfort and political representation came at the expense of the majority of color – they had politicians to back up their views, and democratically so, because for years the black majority was deprived of citizenship.

The argument can be made that because androids are not human, we cannot treat them the way we would treat other fellow human beings. Perhaps this would be convincing if earlier incarnations of it had not been used to justify injustice between fellow human beings: African slavery, European colonialism, the Holocaust – all atrocities against human rights in which humanity twisted itself into a sort of superiority complex, rationalizing the entitlements groups of people believed they held above others. Furthermore, this argument again ignores the most pressing issue – how we behave as humans when dealing with individuals we are unfamiliar with.

* Some may strongly stand by the divide between organic and inorganic beings, and state that since androids are artificial intelligence, we cannot equate such segregation to that between humans. If this is the case, then I offer this other example: if I were to equate androids to computers by virtue of them both being created as tools, our behavior is still indicative of ourselves, at the very least. That is, if I take horrendous care of my MacBook and repeatedly drop it or fail to do simple maintenance on it, my MacBook may still operate and function, but my carelessness reflects poorly on me, on my behavior and my lack of responsibility towards maintaining my computer; if I take excellent care of my MacBook (and I contest that I do), my MacBook may still operate and function, but my maintenance and care reflect well on my abilities as a computer owner and my responsibility towards it.

In Time of Eve, the policies and social structures against human-android interaction likely stem from public fear, distrust and insecurity culminating in a nationwide superiority complex, where it is absolutely normal for a person to feel superior to an android, regardless of the android’s intellectual and functional capabilities. As this negativity became more and more widespread, social structures morphed as well to accommodate such fervor, eventually forming the policies which forbade human-android relationships from progressing into the uncanny valley of emotions and attachment. It is considered taboo for humans to be humane to androids. Now, given social and political structures that deem inhumane behavior proper and normal, what does it mean when one chooses to abide by such norms – or not to?

It takes no courage to act accordingly within social and political structures which grant you power at the expense of others’ dignity and civil rights; it takes an extraordinary person to break away from inhumane behavior otherwise deemed normal by a democratic majority, and doing so speaks volumes about our ability to aspire towards a humanistic ideal above and beyond our dark, personal demons. Our emotions are our own, and if we feel an attachment to something otherwise illogical, then so be it – it is our right as humans, as well as our responsibility, to act in the positive if we are to claim our rights to humanity. So if it means I’ll get laughed at for holding an umbrella over an android’s head, that’s fine by me.

To be real is to be mortal; to be human is to love, to dream and to perish. 

– A.O. Scott

Recommended Articles

A.O. Scott’s Review on A.I. Artificial Intelligence

Roger Ebert’s Review on A.I. Artificial Intelligence

• “The Illusionist” by Viet Le

Armond White’s Review on District 9

• “Not in defense of Armond White” by Roger Ebert

District 9 Soundtrack - Main Theme by Clinton Shorter

*Edit @ 9:41pm - I forgot to add an important paragraph. It is italicized and is marked by a *

Scott Pilgrim – A Tribute to Arcades, 32-Bit, and A.D.D.

If there was ever a movie that truly captured the essence of the digital generation, Scott Pilgrim vs. The World is the pinnacle of it all. Movies that have targeted this generation include Superbad, Juno, Pineapple Express, Knocked Up, The Hangover, The 40-Year-Old Virgin, Anchorman: The Legend of Ron Burgundy, Garden State, Forgetting Sarah Marshall, (500) Days of Summer, Shrek – these comedy films collectively broke away from ’80s archetypes of macho-nacho Arnolds and sexy-smexy Sharons. At the core, these films aimed to create more honest, more vulnerable characters on screen, presented either naturalistically, stylistically, or slang-slinging snarkily – regardless, they are all hilarious in their own right. Most of these films’ soundtracks are compilations of songs, each a flavorful (and invariably) pop culture tribute that listeners will catch here and there, further adding to the slice-of-life, down-to-earth sensibility (or diabolically manic attitude) that these movies likewise try to depict. However, Scott Pilgrim goes where no movie of late has successfully gone before: it takes every aspect of arcades, video games, internet memes, hackers and trolls alike, and throws it all up on the big screen for everyone to see in its ultimate glory.

From here, I think it’s necessary to backtrack a bit – before Scott Pilgrim, before Facebook, before Wikipedia, before Google, before AOL, before MIDIs, before Windows 98 – to the beginning of what we know as the digital generation.

My dad is a hard drive engineer, and has been for as long as I can remember. His occupation invariably resulted in me and my brothers having very early exposure to the desktop computer, which he had brought home from work. There wasn’t any word processing, media center, internet – anything we would call absolutely standard today: it was DOS, and I remember asking and learning how to log into the main screen (“dad-day, what do I type in a-gain?”) My brothers and I would fight over the computer so we could play some of those awesome games like SkiFree, Minesweeper, that submarine game I can’t remember, some Mickey Mouse game I can’t remember either – we fought and clawed at one another just to play simple 32-bit games that were rather difficult (that damn Yeti would always murder us in SkiFree, and only recently have I learned how to overcome this obstacle. Blast!) My brothers and I remember our dad using those giant floppy disks that were “uber cool,” and that the concept of a portable computer, the laptop, was “way futuristic dude.”

We also remember the Nintendo NES and the subsequent Super Nintendo, and how we watched in awe as our friends played Mario on the non-flat, antenna-crowned television (“watch the turtle thing!”); how we were early observers of video sharing when my dad’s friends recorded multiple movies onto a tape (“keep rewinding – Beauty and the Beast is the second one on this video tape”); how much we begged our mom to let us buy and share one Gameboy mini (despite our desolate puppy eyes, we were rather unsuccessful on this front with our mother; however, my younger sibling managed to get away with it somehow and secluded himself with black-and-white Mario – I suspect my dad or his friend had a hand in this scheme); how addicting and costly it was to play your way through the entirety of an arcade game (one time, my brothers and I spent at least an hour playing this Simpsons game, and I suspect we dished out at least twenty dollars’ worth of change to stay alive and get our names on the hall of fame); how anime suddenly became mainstream pop culture when Pokemon hit the scene, with the trading cards and anime and movies and all (my old AP History teacher in high school compared the Pokemon trading cards to the stock market crash that caused the Great Depression, and boy was he right – I must have driven my mom crazy when I told her my card-collecting was over after months of trading and buying like a proper model consumerist); when the N64 duked it out with the PlayStation before the Xbox came into play; when getting on the internet involved jacking your phone line and annoying everyone in your family who needed to make an important phone call (“DUDE, get off the internet! Don’t make me phone jack you!”); when Nokia was still its own telecommunications company, and everyone had a Nokia phone with customizable face plates (“mine has flowers because I am HIP like that”); and how it turned out that typing in queries with an added dot-com didn’t exactly get you to where you wanted (“let’s look up the white house – I’ll type in whitehouse.com and… AW GOD NO!”)

My brothers, my friends and I remember that transition period too: when cable television took off; when pedestrian web design became available (I remember being in awe of my friend when she managed to upload clips of Pokemon onto a website: “DUDE HOW DID YOU DO THAT?!”); when floppy disks were phased out by CDs; when MIDIs were no longer the only option for streaming music on the internet; when Flash and JavaScript started ousting HTML (and ultimately deterred me from pursuing web design as an occupation); when mp3 players started showing up, years before Apple’s iPod launched; when DVDs started selling in Costco, modestly placed next to the aisles and stacks of video tapes; when our teachers started telling us to use Google instead of Metacrawler and the like (“it’s educational!”); and when the digital age became defined by the digitalization at hand, from games to encyclopedias to music to film to communication, everything. I can’t pinpoint an exact year when it happened, but at some point it did, and with a tour de force. Here is when the digital generation took off.

What do I mean by the digital generation? I believe it to be inclusive of the generation that grew up with the early and current development of the world wide web and video games – essentially the generation that saw the transition from VCRs to DVDs, CDs to mp3s, newspapers to online editions, and so on. Individuals of this generation don’t necessarily have to be involved with video games or the internet; rather, what I mean is that the digitalization of technology – of games and information – allowed this generation to research and look up infinite amounts of information at their very fingertips, and that this availability has, in a sense, caused an acceleration of intellect and A.D.D.-like symptoms – there’s just too much information to learn.

This acceleration of intellect and A.D.D. compounds into an interesting mix: there’s almost a manic desire to prove oneself on the net, where your physical identity dilutes down into the avatar you choose, the style and language with which you write, and the subjects and discussions that you habitually gravitate towards. There’s snark, there’s trolling, there’s administration, there’s moderating, there’s polemic-ing, there’s wit, there’s extremism, there’s thoughtfulness, there’s intellect, there’s meme-ing – essentially anything is possible on the net, and you can define an anonymous identity simply by choosing which characteristics of the net to display, ignore, or engage in.

This is where Scott Pilgrim comes in. If there’s one thing this movie does right, it’s paying tribute to the genesis of the digital generation: all the way from SkiFree to the Super Nintendo to arcades to the manic identities of the internet, Scott Pilgrim is a celebration of all these qualities which define this particular generation – everything that makes the internet and video games awesome and stupid at the same time.

You’re pretentious, this club sucks, I have beef. Let’s fight.


Talk to the cleaning lady on Monday. Because you’ll be dust by Monday. Because you’ll be pulverized in two seconds. The cleaning lady? She cleans up… dust. She dusts.

Scott Pilgrim is jumpy, bouncy, shiny, punchy, quirky – all in a glorious bundle of gaming honor and back-and-forth internet-style quips that critics bemoan as the downfall of intelligent and meaningful discussion. It’s like 4chan come to life, barfing up Pedobear and Philosoraptor into the veins of these characters as they duke it out in arcade-style arenas with arcade-style consequences (in fact, some of the aesthetic reminded me of the glory days of Street Fighter and, more recently, Tatsunoko vs. Capcom). Vegans have psychic powers, music creates ferocious beasts, defeated opponents give you tokens, girls pull giant hammers out of their teeny-tiny purses, swords pop out of your chest – none of this makes sense, and trying to decipher their symbolic meanings is as pointless as a Snuggie. You simply have to let everything explode around you in its fireworks display of green hair and ninjas, absorbing and digesting it all into a chyme of caramel popcorn and deep-fried Twinkies. And hell, what a chyme it is.

The film makes no attempt to argue whether or not video games are art (this is a wise choice, given how much flak Ebert received for claiming they are not). Instead, the film is a grand celebration of the frenetic energy that compels players to participate in the fun, the vigor and enthusiasm that results in rapid-fire remarks that can be uncannily hilarious or sarcastic or both. Scott Pilgrim celebrates everything that makes the internet great and terrible, intelligent and dumb, moralizing and demoralizing, and everything in between. Moreover, director Edgar Wright makes an effort to depict these gamer and snark characteristics in a positive light – that while these characters are engaging in what judgmental critics deem immature, insubstantial and trivial, they are still very much human and transitioning from the limbo of adolescence into something less immature, less insubstantial, and less trivial – or at least trying to.

I don’t know how to explain this film except to say it truly is a great tribute to the digital generation, and that trying to understand the film logically is utterly useless. It’s a fervent display of colors and emotions, magnificently game-like and A.D.D. in its aesthetic. It’s an incredibly inventive film on multiple levels, and has established Edgar Wright as a favorite director of mine (his credits include Shaun of the Dead and Hot Fuzz). Michael Cera plays Michael Cera again, but at this point he’s less of an actor and more of a presence on screen that will either draw us to or drive us away from the film itself. Regardless, I enjoyed it thoroughly and was able to ignore Michael Cera being Michael Cera (reportedly he was portraying Scott Pilgrim; now, I have not read the original comic books, but I have a sneaking suspicion that this Scott Pilgrim is incredibly similar to George Michael from Arrested Development, Evan from Superbad, and Paulie Bleeker from Juno – but I may be projecting in this case).

So imagine my dismay when I heard that Scott Pilgrim was deprived of weekend box office success after getting sandwiched between two very gendered films: The Expendables and Eat, Pray, Love. It’s like a retaliatory attack from the ’80s and ’90s – the macho-nacho Stallones rambling up the screen with the He-Man guns and explosions of the ’80s, and the self-righteousness of faux feminism that the ’90s otherwise defended as “female empowerment” when really, the movie-character of Elizabeth Gilbert could easily be one of the most unlikable characters to date (let’s be frank: do you really think you can reach a lifetime of enlightenment by simply taking meditation 101 for only three months, and then moseying off to some other paradise to have a bloody blastastic time?) The respective characteristics of these two decades that the digital generation actively rejects and departs from are, ironically, the same characteristics that crushed Scott Pilgrim’s chances of a successful opening weekend run.

Of course I’m disappointed. There’s a slight bitterness as to how things turned out: I wanted – no, expected Scott Pilgrim to succeed, yet the '80s and '90s are still a haunting force that drives demographics to decide upon which movies to see or not. In this case, it seems that the digital generation isn’t strong enough to empower its cinematic representative, even in the prime age of digitalization. And with '80s remakes like The A-Team, I’m beginning to wonder if current political, economic and social constructions are starting to sway in favor of a decade that resulted in my university tuition being raised and financial aid being slashed, a decade which envisioned a trickle-down theory that has invariably failed on so many levels. 

I wonder these things with a slightly bitter heart, mostly because this digital generation – my generation – is still getting the axe from older generations. Scott Pilgrim celebrates the disco and bellbottoms of the digital generation, the meditation bongs of the internet and the folk songs of video games – not equivalently, but certainly analogously. So in an illogical, emotional way, the lackluster box office of Scott Pilgrim feels like proof that the digital generation still isn’t getting the respect or recognition of moviegoers who’d rather see Sylvester Stallone blow up more stuff or Julia Roberts eat and consummate her way to happiness.

I’ll fling all the RAGEEEE!!! and FUUUUUU!!! I want, but box office numbers are going to be the way they are. I can appreciate what Wright does with Scott Pilgrim, lavish as it is in the memes and hax0rs of the digital generation, and find someone to borrow the six graphic novels from so I can start reading them in my spare time in case A.D.D. bounces me away from my current reading list of Anna Karenin and The Catcher in the Rye. However, when I see renowned critics like A.O. Scott syntactically jumping up and down with glee in their beaming reviews of films that represent the digital generation, I can’t help but smile and feel a little bit validated. Just a bit, but just enough to feel just peachy whenever I feel like picking up the Wii to play some good ol’ Super Smash Brothers Brawl.

Recommended Readings

• Two articles by Erich Kuersten on his criticism and defense of Scott Pilgrim (and Michael Cera)

Scott Pilgrim vs. The World Review by Emanuel Levy

Scott Pilgrim vs. The World Review by A.O. Scott

Greens, Fruit, and Candy - Hollywood versus Cinema

image

Back in 2008 I wrote a review of The Dark Knight, claiming that it was a “balanced, perfect chord that Nolan and his cast and crew [struck], a chord that few have touched or even come close to” and that the film “will be legendary by its own respect to the comic and movie medium, and moreover, by its respect for the general audience.” It was my first movie review, and I wrote this final statement without the same knowledge of film I possess today. Watching an unconventional superhero story unfold, being awestruck by Heath Ledger’s haunting performance, becoming enthralled with Nolan’s film noir-esque vision of Gotham – I wanted to defend this movie on a critical level immediately. Commercial success was inevitable, but I didn’t want the movie to get shanked* at the Oscars because of its undeniable popularity; I wanted to defend the movie on an intellectual level, a critical level, so detractors and “film snobs” wouldn’t deny Nolan’s Batman lore the credit I believed it deserved.

It’s about two years later, and I’ve stopped writing reviews regularly in favor of writing on this blog (also, these days I don’t have time to watch movies quickly enough to write a timely review). I know much more about film today, from its production to its history, and have even expanded my regular online reading from Roger Ebert to the likes of Todd McCarthy, A.O. Scott, Emanuel Levy, Michael Phillips, and more recently Jim Emerson, David Bordwell and Dennis Cozzalio (amongst other writers whom acquaintances and friends have introduced me to; I haven’t listed them because I usually read a minimum of three to ten articles – depending on the word count or subject – before citing them as regular reads). I’ve even stumbled across books and academic articles on Film Studies across the net, such as Bordwell’s generous free download of his book on Ozu and an entry linking academic papers on Nolan on the blog “Film Studies for Free.” The internet is a vast world out there, and persistent searching (coupled with an undeniably stubborn attitude that is possibly paired with procrastination) leads you to amazing finds. Most recently it brought me to some interesting commentary by film professor, scholar and critic Emanuel Levy:

How do you evaluate the artistic and popular dimensions of a particular movie year? For example, was last year, 2009, a good, mediocre, or bad movie year? When can you say with some degree of assurance and coherence that 1959 was a better year than 1958? And what will be the evidence to substantiate our claim that 1939 was the best year in Hollywood’s history? As a film professor, scholar, and critic, I have been struggling with this question for decades.

Levy’s comment and subsequent examples of successful films got me thinking along a tangential line: how do you critique a commercially successful film? Critics and cinephiles alike talk all the time about independent and foreign films and how they oftentimes receive less attention than they deserve; but at the polar opposite, how are you supposed to talk about films that might get more attention than anyone could foresee?

I find these two questions the more difficult ones to answer, since it’s easier to highlight the excellent qualities of an underdog film to a wider consciousness than to castigate, to any real effect, the qualities of a well-funded, widely-distributed film that’s already in the immediate public awareness. For instance, The Hurt Locker was the lowest-grossing film to win the Oscar for “Best Picture” so far, and became well-known because of word of mouth and reviews by acclaimed film critics. Then you have the opposite case, where films like Transformers 2 commercially succeed no matter how much you hack off the shiny lacquer of Megan Fox and Baysplosions hoping that people will realize the movie is a heaping pile of dung.

image

Precious minutes of my life were wasted on this. These are moments where I wish I had a TARDIS.** 

Sometimes in my bitterness, movies like Transformers 2 make me wonder if people just prefer to throw away priceless seconds of their lives to see junk-food excuses for cinema; but then I hold myself steady, take a breather, calm down and think “whoa there, buddy – people are smarter than that,” and my optimism-to-pessimism ratio shifts back to its normal 55:45. Call me naive, but I like to believe that people want to see good movies – why else would they go to theaters in the first place?

Films are an experience, and a personal one too. They tap into our innate consciousness and subconsciousness, and oftentimes the films that we deem “personal favorites” are incredibly revealing of who we are as individuals. For instance, my list of favorite films currently includes Andrew Stanton’s Wall•E from 2008 and Charlie Chaplin’s City Lights from 1931: I loved the elegant and effective simplicity of the physical performances without (or with minimal) sound, not to mention the stories themselves, which I found uplifting, charming and uncannily sweet. Obviously this sentiment wouldn’t carry over to someone who’s primarily a fan of, say, Michael Baysplosions, but that goes to show how different and diverse our respective tastes can be.

Now we’d all like to believe our favorite films are, in fact, great films. However, I prefer to amend this sentiment: favorite films and great films can overlap, but they are not necessarily the same. I say this because films are simultaneously about tastes and judgement. Now personally, I’d love to believe The Dark Knight is a classic, flawless movie that deserves a “great films” slot; however, I’d be in the purgatory of denial if I didn’t acknowledge legitimate criticism about the film’s flaws, and that while the film might be “awesome” that does not necessarily mean it is “great” (as stated by Stephanie Zacharek with regards to Nolan’s Inception). However, when I hear statements like these by Jeff Wells – that a commercially and critically successful film doesn’t need to be nominated for Best Picture to be great, and thus voters should vouch for films that are less noticed – I want to hit my head on the desk and write a letter to the Academy asking “what’s the bloody point of calling it ‘Best Picture of the Year’ if you’re just going to ignore commercial successes anyways?” My annoyance begins boiling again, but then I remember that the Oscars are always politically driven, and as A.O. Scott stated eloquently regarding the 82nd Academy Awards: 

The “Hurt Locker”-“Avatar” showdown is being characterized as a David-versus-Goliath battle, but melodrama and rooting interests aside, it is really a contest, within the artificial arena of the Oscar campaign, between the mega-blockbuster and the long tail. That last phrase, the title of a 2006 book by Chris Anderson, already has a bit of an anachronistic sound, but Mr. Anderson’s idea, shorn of some of its revolutionary overstatement, is still compelling. As digital culture makes more and more stuff available and spills it faster and faster into an already swollen marketplace, some works will establish themselves slowly, by word of mouth, social networking and serendipitous rediscovery.

That hypothesis is likely to be tested more strenuously than before in the movie world. The money to produce and publicize the kind of middle-size movie that has dominated the Oscar slates in recent years is drying up. Cheap acquisitions can be turned into hits — last year’s best picture winner, “Slumdog Millionaire,” being the most recent long-shot example — but there are likely to be fewer luxury goods for the prestige market.

Only one of the current crop of best picture candidates, “Up in the Air,” fits that description: it has a polished look, an established star, a literary pedigree and a medium-size budget. And it looks — all of a sudden, after a strong start in Toronto and in spite of perfectly good box office numbers — like an outlier, a throwback.

Which is to say nothing about its quality. The Oscars are never about that anyway. They are about how the American film industry thinks about itself, its future, its desires and ideals. Right now it is thinking big and small, trying to figure out how to split the difference, and hoping we will keep watching. Wherever and however we do watch.

Can a film be “awesome” and “great” simultaneously? More specifically, can a film be commercially and critically successful at the same time? My naive self again would say yes, but would qualify the statement with an additional “– but very rarely does it happen.” I say “rarely” because, when faced with inevitable commercial success, wide-release and blockbuster films are more prone to backlash for the very qualities that made them so successful and widely appealing (note: when I speak about “commercial success,” I mean films that pull in the box office numbers, which by extension is a numerical indicator of the film’s popularity amongst moviegoers, but not necessarily its critical reception). Take Juno, for instance: it was a hit at the 2007 Toronto International Film Festival, yet when it came close to Oscar season there was enormous backlash from people who felt that Juno’s pop-slanging shenanigans were unnatural and unrepresentative of how teenagers actually talk and act, and that the film sent an “immoral” message to teenagers about sex and teen pregnancy.

I think Juno is a fine movie with slick, witty writing. But do I think it deserves a slot in “great films of all time” lists? In its own respect, I believe so, yes. To be perfectly honest, Juno is not exactly my cup of tea: I like jasmine tea, but in the end I prefer the taste of green tea simply because of my personal preferences – and in this case, Juno is jasmine tea. No, this doesn’t detract from my appreciation of the film; actually, it compels me to be even more holistic when looking at movies, and to make a conscious effort to differentiate (but not separate) between movies that I believe are great and movies that I personally adore to no end. So if I were to compile a list of movies I believe are “great,” I’d make an effort not only to appreciate films that aren’t necessarily my favorites and what they do well, but also to defend and argue for films that are my favorites if they are included. This gives me room to relish movies that are cheesecakey goodness and include them in my list of personal favorites (i.e. Kung Fu Hustle) and extol movies that I find artistically and technically outstanding which also happen to be in my list of personal favorites (i.e. My Neighbor Totoro). I like to imagine the differences, similarities and overlap between “great” and “favorite” movies like this: great movies are your greens for cinematic fiber, favorite movies are your candy for cinematic sweets, and movies that are both great and personal favorites are fruit.

image

I don’t bloody care what botanically correct scientists say: the tomato is still a vegetable in my books. And I will chuck it at whomever I wish – Fresh or Rotten.

We all want our favorite films to be fruit. But realistically, you’ll have to admit that your personal favorites will not all be fruit – some will be candy no matter how much you believe otherwise (i.e. caramel apples). At the same time, claiming that you only enjoy films of greens ignores a lot of what films of candy and fruit offer. After all, films are also about entertainment: I could sit through countless art films and analyze the brilliance of the auteurs, but if I don’t feel compelled to re-watch one like a hyperactive child, it’s unlikely the movie is going to be a personal favorite of mine. A recommended film, possibly, but probably not fruit, and definitely not candy.

So how many films are actually fruit? Here, I decided to take a cue from Levy’s Four Criteria of Evaluation – 

  1. Artistic: Critics’ choices
  2. Commercial: Public choices, films that were popular with moviegoers—for whatever reason
  3. Innovative: Films that pushed the boundaries (technical, thematic, stylistic) and had impact on the evolution of film as a singular medium with new potential and possibilities
  4. Oscar movies: The five films singled out by members of the Academy of Motion Picture Arts and Sciences (AMPAS) for Oscar nominations and awards.

– and then took a look at the list of worldwide box office records, comparing each film to its ratings on Rotten Tomatoes, Metacritic, and IMDb (note: click on the chart and graphs to see the full versions of these statistics, which were taken on 8/9/10; numbers are taken from here, though the numbers have changed with the addition of Toy Story 3 to the “Top 20” list as of 8/10/10):

image

Now, with my handy dandy Excel skills, I also made some graphs so we can visually see what’s going on here (titles are the Y-versus-X values of each graph; note that the highest-grossing film worldwide is the first value on the X-axis – essentially it goes from #1 to #20, left to right; for anyone allergic to Excel, a rough Python equivalent is sketched after the graphs):

image

image

image

image
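For anyone who would rather rebuild these graphs outside of Excel, here is a minimal matplotlib sketch that puts all three aggregates on one chart. The scores below are hypothetical placeholders, not the actual 8/9/10 numbers from the chart above, and the IMDb rating is scaled from its 0-10 scale to a percentage so the three lines share one axis.

    # A rough stand-in for the Excel graphs above; all scores here are
    # hypothetical placeholders, not the actual 8/9/10 chart data.
    import matplotlib.pyplot as plt

    # Films ordered by worldwide gross, #1 at the left, as in the graphs.
    ranks = [1, 2, 3, 4, 5]
    rotten_tomatoes = [82, 94, 72, 60, 35]   # percentages (placeholders)
    metacritic = [84, 83, 61, 55, 35]        # percentages (placeholders)
    imdb = [8.3, 8.1, 7.0, 6.5, 6.0]         # 0-10 scale (placeholders)

    plt.plot(ranks, rotten_tomatoes, marker="o", label="Rotten Tomatoes")
    plt.plot(ranks, metacritic, marker="o", label="Metacritic")
    plt.plot(ranks, [score * 10 for score in imdb], marker="o", label="IMDb (x10)")
    plt.xlabel("Worldwide box office rank (#1 to #20, left to right)")
    plt.ylabel("Score (%)")
    plt.legend()
    plt.show()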

All of the following findings rest on the assumption that box office numbers are the best estimate we have of how “popular” a movie is – that is, how compelled people actually are to dig into their wallets and see it for whatever reason, regardless of the critical reception before and after the film’s release. So if we take roughly 70% as a generally favorable consensus after averaging the critical percentages from Rotten Tomatoes, Metacritic and IMDb for the current top grossers, we find this: only 14 of the 20 listed films are generally favored by critics and viewers alike, which means that about 70% of top box office grossers will meet some amount of critical success – this group essentially consists of the fruit and candy of movies, films that could perhaps display a minimum of two or three of the four criteria of evaluation that Levy presents.

However, if we were to estimate what percentage of these movies meet universal acclaim by setting a minimum of 85%, we would find that only about 5 of the 20 titles could potentially be considered “great films,” and that only approximately 25% of top grossers could actually be considered fruit and possibly possess all four qualities of Levy’s criteria.
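For the curious, the arithmetic behind those two cutoffs is simple enough to sketch in a few lines of Python. As before, the titles and scores are hypothetical placeholders rather than the real chart data, and the IMDb rating is multiplied by ten to put it on the same 0-100 footing as the other two aggregates.

    # Averaging the three aggregates per film and counting how many clear
    # the 70% ("generally favorable") and 85% ("universal acclaim") bars.
    # All numbers are placeholders, not the actual 8/9/10 chart data.
    films = [
        # (title, Rotten Tomatoes %, Metacritic %, IMDb /10)
        ("Blockbuster A", 94, 84, 8.3),
        ("Blockbuster B", 72, 61, 7.0),
        ("Blockbuster C", 35, 35, 6.0),
    ]

    def average_score(rt, mc, imdb):
        # IMDb's 0-10 rating is scaled by 10 before averaging.
        return (rt + mc + imdb * 10) / 3

    favorable = [title for title, rt, mc, imdb in films if average_score(rt, mc, imdb) >= 70]
    acclaimed = [title for title, rt, mc, imdb in films if average_score(rt, mc, imdb) >= 85]

    print(f"Generally favorable (>= 70%): {len(favorable)} of {len(films)}")
    print(f"Universal acclaim (>= 85%): {len(acclaimed)} of {len(films)}")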

There are a number of things that could be wrong with these numbers. For one, the worldwide box office numbers in this data aren’t adjusted for inflation, hence the list consists mostly of films that are relatively recent in film history. Another thing is that these numbers are based on worldwide gross, so they favor films that were fortunate enough to be released internationally in addition to domestically – which invariably favors films that get lots of funding from studios, and more often than not these studios are big-time Hollywood players. Lastly, the review aggregates are from distinctly English-speaking sources, so the “universality” of critical acclaim possibly only covers the general Western hemisphere while not necessarily reflecting the same appeal in the Eastern hemisphere (i.e. the reception of Danny Boyle’s Slumdog Millionaire (2008) in America was much warmer than in India, where the film is set). Still, I think it’s important to see how box office numbers and critical consensus relate, and based on what I’ve found it all leads back to my original assertion: that it is rare for a commercially successful film to also meet universal critical acclaim.

However, if we step back from the standards of critical acclaim and consider the more holistic standard of favorable acclaim, then my hope and naivete aren’t unfounded: a majority of the top box office grossers aren’t bad films. They may not be great, but they’re not bad either; in fact, we could even conclude that they’re the perfect amount of candy and cheesecakey goodness that most moviegoers want when they go to the movies for whatever reason – entertainment, escapism, evaluation, everything.

image

Michael Bay presents Explosions! By Michael Bay. 

In the end, does it matter who got the higher score or raked in the most money? Frankly, no; I’ll probably defend my favorite films until my deathbed and hate Transformers 2 for the rest of my life. That doesn’t mean I won’t find flaws or excellence in great, favorite and personal-vendetta films either – in fact there’s a certain joy in finding things that could’ve been done better (or worse) in the films that you love or hate, almost as if you’re finding an Easter egg that the filmmaker forgot they even put in. For instance, who knew you could so excellently incorporate sound effects (“pew pew pew!”) and classy comedy like humping dogs into a $200 million budget film created by full-grown, mature and enlightened adults like Michael Bay? It’s like they sympathized with my childhood, where I counteracted my brothers’ reign of sibling terror with my pointed index finger after my mum told me to suck it up and fend for myself against their shenanigans (“no, I will not buy you a Nerf gun to assassinate your brothers with - go use a stick or something. And stop climbing the stair railing like a monkey - no I don’t care if you’re Quasimodo, you’re going to break off the railing!***”).

We may never know what truly is “the best film of all time.” Lists will consistently bring up the same movies like Citizen Kane and Casablanca, but even then there’s a certain amount of subjectivity to any “greatest movies” list. Of course such films are always worth pursuing if the recommendation is compelling enough, but the decision is always personal and up to you alone. So while the probability of finding fruit might be low, it’s worth it after getting a palate full of greens and candy.

image

*I was one hell of an angry cat when I found out The Dark Knight and Wall•E didn’t receive Best Picture nominations that year. Hell hath no fury like a person still without a cat.

**Props to anyone who gets this reference. Even bigger props if you’ve got your own sonic screwdriver.

***I like to believe this was her loving way of telling me not to fall down to my peril and death. Regardless, I still climbed those stair railings because if anyone was going to be an awesome Quasimodo, it would be ME. 

Referenced Articles and Links (ordered with regards to this article)

The Dark Knight: 2008 - my first movie review

From Books to Blogs to Books - David Bordwell

Christopher Nolan Studies - posted and compiled by Catherine Grant

• 1960: Here is Looking at You Movie Year - Emanuel Levy

The 'Best Picture’ Academy Awards: Facts and Trivia - Filmsite.org

Is Inception This Year’s Masterpiece? Dream On - Stephanie Zacharek

Will 'Wall-E’ be nominated for Best Picture at the Oscars? - Tom O'Neil of Gold Derby (this is a compilation of critics’ opinions on Wall-E’s prospects of a “Best Picture” nomination; search for Jeff Wells to find his quote)

The Politics of an Oscar Campaign - Peter Bowes

• Huge Film, Small Film: Big Stakes - A.O. Scott

Jumping the snark: The Juno backlash (backlash) - Jim Emerson

All Time Worldwide Top 20 - The-Numbers.com

Recommended Articles and Links (no particular order)

• Masterpieces: How to Define Great Films? - Emanuel Levy

The Top Film Criticism Sites: An Annotated Blog Roll - compiled by Paul Brunick of Film Society of Lincoln Center

The Fall of the Revengers - Roger Ebert 

I’m a proud Brainiac - Roger Ebert

Fade In Magazine - A nice magazine that looks into the nitty gritty workings of Hollywood business and the film industry

Superheroes for Sale - David Bordwell

Hey, Wall-E: Shoot for the Top (Great animation deserves shot at Best Picture) - Joe Morgenstern

Trivia: Can The Dark Knight Win the Best Picture Oscar as a Write-In Candidate? - David Chen

Just for Fun (because comedy is the best relief for bitter film memories)

Rifftrax: Transformers 2 – BATTICAL!

Michael Bay presents: Explosions! – courtesy Robot Chicken

Michael Bay Finally Made an Art Film – Charlie Jane Anders of io9

Cat Safety Propaganda - How I reacted when I learned Wall•E and The Dark Knight did not get “Best Picture” nominations (see, the cute little girl is the oppressive hand of the Academy and its innate biases against animation and comic book lore, and the cat is a cat – and hell hath no fury like a cat angered… I’m not sure where I fit in here except that the cat’s reaction to the cute little girl is more or less how I acted when I heard the Oscar news those years ago)