The Playful and Ephemeral Opening of "Paprika"

I rewatched Paprika not too long ago (within the past week, I believe) and couldn’t get over how visually astounding and beautifully animated the opening sequence is. It establishes the feel and aesthetic of the film in a mere two minutes, introducing us to the lovely and bubbly Paprika and her physical (?) counterpart Atsuko without saying anything. There’s a playfulness to it all, a characteristic all too appropriate to Paprika’s upbeat nature. There’s also a rather surreal element to it, where Paprika goes in and out of, well, anything: her ephemeral presence is both light and warm, unrestrained by the physics of reality and free to bound from place to place, sky to sky, street to street, and person to person – and all just as seamlessly as a dream. Here’s a video link to the opening scene and some following screenshots to help me describe what director Satoshi Kon does so exceedingly well – which, if anything, is an excellent example of dynamic and creative transitions:

In this cut with Detective Konakawa flipping over Paprika’s business card, a warm orange hue begins the introduction to Paprika’s character – unique, outgoing, and warm too.

This next cut is a nice transition from the business card to the highway, where Paprika drives on by on her scooter. Though she is centered, there is a lot of movement going on here, so the framing doesn’t feel stagnant.

This first fluid transition (where Paprika goes from one place/medium/whatever to another) is probably the most unexpected one, since at this point most first-time viewers aren’t aware of Paprika’s dream nature. Again there is a prevalent orange color with the rocket ship, as well as a playful wink at hyper-cartooned anime aesthetics.

Here’s a nice little fluid transition of Paprika (literally) blasting off into the night sky…

… and as she flies through the night sky she fluidly transitions again, this time into two advertising billboards (I’m not sure if this is product placement on Kon’s part, as I’m not entirely familiar with Japanese products/beer…)

This is really the first rough-cut transition, where we’re not quite sure how Paprika ends up in the monitor of this (exhausted) employee. 

(Notice, too, how Paprika is wearing a white dress quite similar to the woman’s attire in the sleeping employee’s photo, just left of the monitor. A nice little detail I only caught while taking screenshots of the opening sequence!)

As she tiptoes away, Paprika begins to bounce into bigger and bigger strides, leading to this fluid transition below…

I love how Kon depicts Paprika as a bouncing entity, a characteristic that not only emphasizes her playfulness but more importantly the dream nature of her existence, one that so cheerfully bounces around within the colder barriers of reality. Notice, too, how her red-orange shirt contrasts with her blue surroundings, again emphasizing the warmth of her presence. 

This is sort of a fluid transition from the office building, where prior to seeing her bounce on the street we saw her bounce her way through the hallways. This is another playful and humorous moment in the opening scene, where in her frustration she simply lifts up her hand…

…snaps her fingers…

… and stops traffic like that!

The effect here, of course, is one of really establishing a sort of surrealism regarding Paprika’s presence in the real world, a presence that is almost God-like in some respects. She simply bounces around carelessly, free of worry or fear of breaking physical laws or anything else that otherwise bars everyone chained to the physical world. The aesthetic effect is also creative, echoing back to the days of pausing video cassettes (DVD kids be damned!)

Here is the second rough-cut transition in the opening sequence, where we suddenly see Paprika biting into a hamburger. We also see her reflection in the background mirrors, which are a rather important part of this brief (and comedic) scene: 

As the two guys hit on her, we see Paprika’s mirror reflections express disgust in different ways to comedic effect (a cascade of rejections, I must say!) The interesting part is that, like with the sleeping employee prior, Paprika is interacting with people in the real world; the difference here is that the two guys are interacting with her as well, just like a real-life conversation. The multiple mirror reflections reveal multiple dimensions of Paprika, dimensions that otherwise don’t reveal themselves in the real world – a theme that is prevalent throughout the entire film.

This fluid transition here is probably one of the most creative in the entire movie. As Paprika runs out of the eatery to get away from the two suitors, she manages to disappear by jumping into a man’s shirt and then jumps back out onto the screen, looking straight at us, the viewer. This jumping in and out of frame recurs constantly throughout the rest of the film, too, where in your dream state you can really do anything you want, just like a God – the only limit is the span of your imagination.

(The transition is inventively reminiscent of Cinderella, Paprika style)

This is the only shot that doesn’t involve Paprika, establishing the real world at hand and the encroaching sunrise. It leads to the third and last rough-cut transition below…

…where we see Paprika again riding her scooter on the highway.

We see Paprika pull up behind a car, and as the car passes by the camera we see that she is now the one driving it.

Another car passes by the camera and we see that another woman is now driving, her hair down and flowing in the air. Another car passes by the camera for the final fluid transition…

… and introduces us to Atsuko, her hair now tied up and no longer flowing in the air. 

One of the nicest things about this opening sequence is that besides being astoundingly creative, it really stretches the potential of animation to its limit: Paprika is a dream avatar, and as such she can do anything we otherwise wouldn’t even think of in the real world. It also gives us a nice transitional introduction to Atsuko, Paprika’s physical counterpart, and gives us a lot of hints regarding how both characters are similar to and drastically different from one another.

Paprika is truly one of the most visually astounding films to date, and probably one of the best films about dreams as well. It definitely isn’t one of the more accessible films; in fact, I think it’s even more complex than the linearly driven narrative of Inception (if you’re one of the people who said “holy mind f**k” at the end of Nolan’s film, you’ll probably find Paprika absolutely incomprehensible). Regardless, I couldn’t recommend this film enough for anyone, especially for animation enthusiasts and fans of psychological/dream-themed films – and I do hope this opening scene will nudge hesitant film fans to take a leap of faith into the magnificent mind of Satoshi Kon.

The Nature of a Comic Strip

Calvin and Hobbes was the first comic series I truly remember reading when I was still a kid. My brothers, my mum and I used to bunch together in the bookstore and giggle incessantly at Bill Watterson’s jokes, the cleverness and absurdity that one boy and his tiger could get up to in the course of four panels and the Sunday special. Calvin and Hobbes effectively opened up my world to drawing, storytelling, humor and intellect, and even now I’ll flip through one of the anthologies and realize how incredibly timeless some of the strips are.

Lately, I have on occasion picked up one of the books lying around the house and read through some comics, and I noticed something interesting: newspaper comic strips consistently break the fourth wall, very much in tune with classic Looney Tunes and the like.

Azumanga Daioh by Kiyohiko Azuma

Very rarely have I come across a comic strip series that doesn’t outright address the reader with some punchline; in fact, I think the very medium is so limited that the cartoonist must employ the fourth wall. The comic strip is much different from comic books, graphic novels and manga in that it is limited by the number of panels, a result of the nature of traditional newspaper print; additionally, since each strip is published daily, storylines often only run as long as two to four weeks due to the constraints of panels, printing and lining up with Sunday comics, which often take more time to print since newspapers must also process color. One exception to this storyline limit is Azumanga Daioh, the Japanese yonkoma series that detailed the everyday life of six Japanese high school girls until their graduation.

In comics, continuity is something cartoonists often play around with. The comic strips For Better or For Worse by Lynn Johnston and Baby Blues by Rick Kirkman and Jerry Scott featured continuous storylines where characters aged and past events factored in consistently with the present comic strip. While these comics may not necessarily age in step with our current timeline (Kirkman and Scott have stated that the timeline of Baby Blues runs a third slower than ours), they are devoid of the chronological inconsistencies that would otherwise result in a paradox.

However, most newspaper comic strips feature ageless/un-aging characters for simplicity’s and serialization’s sake, employing varying degrees of continuity that meet the needs of the story and the punchline. Environments commonly change to keep the art interesting and relevant to the season, and perhaps the technology too.

Garfield by Jim Davis

Pearls Before Swine by Stephen Pastis

A pie chart on how you may have killed Kenny. 

Comics like Garfield and Pearls Before Swine rely on minimal continuity save consistent characters with consistent quirks and faults (in Swine, the dynamic between the gullible Pig and the egomaniac Rat is a staple of Stephen Pastis’ rhetoric; in Garfield, Garfield’s sarcasm and his owner Jon’s haplessness are likewise a staple of Jim Davis’ serial, though it has become stale as of late). This lack of continuity is similar to how Looney Tunes characters would bludgeon and bash each other’s skulls in and return the next day just fine; a modern and more extreme example is how the character Kenny dies spectacularly in episodes of South Park, only to miraculously be back, alive and well, in subsequent episodes (a trend that decreased drastically after the episode “Kenny Dies”).

The Far Side by Gary Larson

Comics like Bill Amend’s Foxtrot maintain the same character caricatures whose dynamics with one another depend on what is relevant at the time of the comic’s publication – it is a zeitgeist, effectively. By nature, comics like these are probably the best indicators of pop culture phenomena during a given era: for fun, I recommend you go to your local bookstore and look at older anthologies of Foxtrot and see how anachronistic some of the references are these days (I still trip out when looking at old comics where Paige talks about Madonna tapes). You also have single- or multi-panel comics that are political or social satires, or even one-shot punchlines that hit you just like that – with or without dialogue. Comics that come to mind include The Boondocks by Aaron McGruder, The Far Side by Gary Larson, or even the caption contests held by The New Yorker.

Pogo by Walt Kelly

Then you have unusually special comics that have ageless characters and never explicitly depict anything that marks the strip’s specific publication date, all save minute details like a nondescript TV set or a generic bicycle. Two of the best examples I know are Walt Kelly’s Pogo and Bill Watterson’s Calvin and Hobbes, where the characters express ideas that are universal and timeless, effectively making the thematics, messages and punchlines of the comic strips (ironically) ageless. In Calvin and Hobbes, Watterson never depicted Calvin’s parents using a specific type of technology like a computer or wearing a specific fashion style; in Pogo, Kelly used anthropomorphic animals with basic equipment like a medicine bag distinguished only by its red cross.

Watterson’s and Kelly’s nonspecific style of drawing – in which characters are distinguished solely by basic physical features and their personalities – is similar to how Christopher Nolan used very little technology in Inception to keep the timeline of the film nondescript: the architect Ariadne constructs dreams first by hand rather than by computer; dream worlds are based on real-life urban and natural landscapes like Paris and Los Angeles; phones, monitors, laptops or any dateable piece of technology never make a cameo once throughout the entire course of the movie – it’s all paper, pencil, and classy suits.

Despite the differences in continuity, nearly every comic I’ve mentioned (if not all) has broken the fourth wall, communicating directly to the reader what a character feels about the main punchline or reveal. It’s effectively a “That’s all folks” and “What’s up doc?” wink that breaks the confines of print and ink. It’s an active acknowledgement that the character knows they are being observed, and by directly addressing the reader the cartoonist effectively breaks down the wall of separation between artist and viewer, a barrier that is commonly inherent to all artistic mediums (especially narratives).

The Narrator in Into the Woods

Malcolm in the Middle

This kind of fourth wall break is rarely accomplished in any other medium besides newspaper comics and cartoons. The most original employment of the fourth wall (and breaking of it) that I know of is in Stephen Sondheim’s Broadway musical Into the Woods, where the narrator is dragged into the story he has been narrating to the audience and subsequently dropped a hundred-plus feet by a vengeful giantess (thus his untimely death). The TV show Malcolm in the Middle also broke the fourth wall consistently, with the main character Malcolm (Frankie Muniz) frequently looking at the camera to narrate exactly how he is feeling at a certain moment before turning back to the events taking place, with everyone else oblivious to his narration. Beyond Sondheim’s musical masterpiece and Malcolm in the Middle, I can think of little else that has successfully pulled off the fourth wall (and breaking it) without being corny, dimwitted or unappealingly sarcastic.

Newspaper comic strips are unique like that. They present to us a visual way of reading and taking in an artist’s version of the editorial or humorist column. It’s a medium that compromises a lot to fit everything into four panels six days a week and a set shape for Sundays, or to fit as much substance and punch into a mere panel for the surest and hardest hit as possible – and hell, what a job it is. 

For the record, the job apparently includes staring at empty space for long periods of time before coming up with an idea – something I’ve come to empathize with as of late. 


Also, this happens to be the 100th overall tumblr entry since I began back on May 18th, 2010. Thanks to everyone who’s been reading! :)

TRON: Legacy


I had the good fortune of watching TRON: Legacy this past weekend in IMAX 3D, which is probably the best way to watch any film being shown primarily in 3D (otherwise you can call yourself ripped off by what is essentially a marketing gimmick). Just a year ago I saw some sneak peek clips and pictures of the film, and was blown away by the sheer design of the costumes and sets. At that point I hadn’t yet seen the original TRON (I have since) and knew little of the universe except that there were speed bikes (thanks in large part to numerous Family Guy parodies); all I knew was that regardless of the story, TRON: Legacy would be astoundingly beautiful in its construct. And now, having seen the cult sequel in its IMAX 3D glory, I can say my original assessment is far from wrong.

What TRON: Legacy excels at in its technique and artistry it equally lacks in its screenwriting and cohesiveness. As Emanuel Levy commented about director Joseph Kosinski, who studied architecture at Columbia, he has an excellent sense of design but a poor instinct for story – all of which makes TRON: Legacy easily one of the most beautiful and squandered movies in a while.

The original TRON and its sociocultural significance


The original TRON and new TRON: Legacy posters side-by-side

When it first came out in 1982, the original TRON was breathtaking and monumental because it effectively marked an emerging era of computer-generated special effects in cinema (ironically, it was not nominated for that year’s Academy Award for Special Effects because, reportedly, Academy members felt it relied too heavily on computer effects). TRON became a cult classic because, effectively, it really was the modern sci-fi movie as we know it today, and one that focused on an emerging computer science for that matter. At the time, TRON truly was a movie of the future, grounded in RAM and coding and programming and all.

TRON’s story wasn’t entirely spectacular, but it was nonetheless cohesive in distinguishing what represented what (programs, deletions, commands, etc.) and how these bits of binary language corresponded and interacted with the physical world. The Master Control Program (MCP) was the prime administrative program of ENCOM, CLU was Kevin Flynn’s (Jeff Bridges) hacking program, and TRON was the security program of Alan Bradley (Bruce Boxleitner). It was all very straightforward, and for the computer-savvy a rather remarkable way to visually represent what goes on inside the circuitry of electrical engineering.

At its very core TRON heavily echoed the McCarthyist sentiment pervasive in the American public during the Cold War. Visually, the authoritative, antagonistic program personas were highlighted in red to contrast against the non-authoritative, subservient programs. Rhetorically, the movie is as heavily anti-Communist-regime as it can get: the sole antagonist, the MCP, rules tyrannically in the virtual world of computer programs, destroying any of those below him who refuse to follow his direct orders. The recurring rhetoric of “Master Control Program,” “the MCP,” throughout the film becomes so heavily ingrained that it’s hard not to draw parallels to an Orwellian “Big Brother” regime.

This very McCarthyist sentiment is what helps tie together an otherwise conventional story in the original TRON, which is quite a feat considering how easily the astounding special effects took away from the development of the human characters. Still, given the budgetary issues and technology at the time, I’d say TRON was truly a remarkable film of its time.

The stand alone sequel


Image from the original test footage of TRON: Legacy

With the success of the test footage shown at the 2008 Comic-Con, TRON: Legacy was greenlit by Disney for a full-feature production, helmed by commercial director Joseph Kosinski (his short feature for the make-believe product iSpec helped get him the spot as an excellent visual director). The new film would feature TRON veterans Jeff Bridges and Bruce Boxleitner as Kevin Flynn and Alan Bradley, respectively, as well as new faces like Garrett Hedlund as Flynn’s son Sam and Olivia Wilde as Quorra, a brave warrior program (much to the chagrin of many fans, the only veteran protagonist of the original TRON who did not make an appearance was Cindy Morgan, who played Dr. Lora Baines, whose program counterpart is YORI). You could easily get away with watching TRON: Legacy without having seen its predecessor TRON beforehand – just don’t complain if you end up not knowing what exactly the programs TRON and CLU really are by the end of the stand-alone sequel.

TRON: Legacy began as an experiment to see if audiences would react warmly to a revamp of a classic 1980s cult movie icon, a tone which effectively set the mood for the rest of the film’s production. Many of the original pieces from the predecessor are present (light bikes, disc games, program removers, solar train, the physical-to-virtual-zapper-thing), but there have been a significant number of additions to the virtual world since its debut twenty-eight years ago: now there are light cars, light planes and jets, skyscrapers, music programs (Daft Punk makes a nice cameo), and an undeniably cruel follow-up antagonist, CLU 2.0. 

The sequel’s story revolves around Sam’s accidental entry into the virtual world constructed by his father, called “the Grid,” and his attempts to get his father Kevin back to the physical world while fending off the sadistic reincarnation of Kevin’s program counterpart, CLU. If you think I’ve spoiled everything, well, I have news for you: I really haven’t, because the story isn’t what TRON: Legacy is holding on to for its core cohesiveness – it’s effectively a beautiful designer movie, with a bolts-and-locks story to tie scenes together into something theatrically presentable. You could say it suffers the same problem that the film 9 or, for anime-savvy fans, Final Fantasy: Advent Children were riddled with throughout; luckily for Kosinski, though, his actors – particularly Bridges, Hedlund and Wilde – do their best with the script, and manage to keep us tuned in while they’re at it.

There’s no denying that besides the visual and design excellence of the TRON: Legacy universe, the soundtrack by Daft Punk (internationally known for their hit single “Harder, Better, Faster, Stronger”) is probably one of the best aspects of the entire film. With a strong electronic aesthetic alongside the orchestral virtuosity of the London Symphony Orchestra, the TRON: Legacy soundtrack is a sure contestant for Grammy awards and very possibly an Oscar nomination for its score. So strong is the auditory presence of Daft Punk that in some cases, the scenes on screen are effectively a music video complement to the electronic duo’s mastery (not that this is a bad thing either). In my mind, Daft Punk has effectively rivaled – if not overshadowed – Hans Zimmer’s heavily electronic work for Inception, which is quite remarkable since I think Inception is one of Zimmer’s best works to date.

A potential squandered


An aged Kevin Flynn, reprised by Jeff Bridges

The most disappointing thing about TRON: Legacy is that it fell short of a potentially excellent story. While I have the fortune of being familiar with the original TRON, by the way Disney advertised TRON: Legacy it seems the company was aiming for a wider audience, one that wasn’t necessarily familiar with the film’s predecessor (I wouldn’t be surprised if many audience members were unaware that TRON: Legacy was a sequel to begin with). Now, while familiarity (or lack thereof) with the original 1982 film won’t necessarily help your understanding of the newer reincarnation, being familiar with TRON does help you appreciate how much the newer film has accomplished technically.

That aside, the story of TRON: Legacy could have easily been one of the most interesting narratives for consideration had Kosinski and his production team done a better job at writing and piecing together all the scenes into one cohesive whole. For one thing, there’s a certain point where virtual entities are no longer clearly defined by their physical-world counterparts as in the original TRON. For instance, what exactly do a club mixer and its host Zuse represent? Moreover, if the isomorphs are what Kevin Flynn so adamantly describes as the answer to all of man’s issues with religion, science, and medicine – well, to begin with, what exactly is the purported isomorphs’ real-life counterpart? (My guess would be somewhere along the lines of totipotent/stem cells, but that’s for another debate.) What helped keep the first TRON together was that there were very set entities inside the virtual world, all of which were well defined by their purposes, actions and commands, which could be drawn to a physical-world counterpart (control, alt, delete – repeat!) In TRON: Legacy, these distinctions and purposes are less well-defined, and effectively everything you see is, well, a virtual reality that somehow represents the computer world – how it does so I won’t endeavor to explain, for fear of a self-induced headache of an electronic existential conundrum. And even for those familiar with the original TRON, it becomes confusing as to what the hacking program CLU and the security program TRON have morphed into (or not?) in this newer computer universe – what exactly was going on, virtual- and real-world-wise?

Ultimately, this lack of distinction and definition in the TRON: Legacy universe is what contributes to its vision-over-cohesion end product. Moreover, the rather overdrawn monologue and dialogue scenes do little service to any character development (save the acting caliber of the cast members) and render pretty much all the protagonists and antagonists on screen into clichés, caricatures even. CLU (played by Jeff Bridges, whose younger appearance is achieved with CGI magic) is a cruel villain, but beyond that there is little else that defines him save ultimatums (“I’m going to build the perfect world!”); Kevin Flynn (also played by Bridges) is remorseful, thoughtful and talented, but besides the occasional bright spots of the beloved Bridges (“you’re messing up my zen, man!”) there seems to be a lost spark of the energy that Bridges can so easily play; Wilde is, as Roger Ebert says, a beguiling Quorra, though it is unfortunate that the screenwriters allow her little else than wide-eyed curiosity and warrior ferocity; and Hedlund plays an appropriate Sam Flynn, though what else he could add to an otherwise rebellious-individual-covering-up-childhood-pain typecast in the spread-thin story is beyond me. What we end up with is a beautiful film with a story too weak to uphold its visual grandeur – an unfortunate effect for what is otherwise one of the most anticipated science fiction films to date.

Comparing and contrasting the themes of TRON and TRON: Legacy


CLU and his henchmen 

As I stated earlier, the original TRON was defined heavily by the McCarthyist sentiments prevalent during the time of its production. When I first heard about TRON: Legacy I wondered what would thematically tie it together, whether the writers would opt for the now archaic anti-Soviet sentiment or indulge in the typical action-over-substance mentality endemic to Hollywood these days. Instead, I was most surprised by the Greek-mythology-inspired thematics that, despite being poorly fleshed out, were a core component of the new TRON: Legacy (warning: spoilers begin here – skip forward to ‘end of spoilers’ if you haven’t seen the sequel):

Throughout TRON: Legacy, there are repeated references to the mythical presence of 'users’ (effectively us humans who type away at our keyboards) in the virtual world of the Grid, a presence which is embodied by both Kevin and Sam Flynn when they get stuck in the computerized universe. CLU, who is derived from Kevin, serves as the authoritarian ruler of the Grid after forcing Kevin to flee from its main cityscape. With this in mind, we can equate 'users’ like Kevin and Sam Flynn to flawed Greek gods and CLU to a Greek demigod; everyone else in the Grid is effectively a mortal that the user gods and the demigod CLU have authority over (to varying degrees, of course). Like the Wachowski Brothers’ Matrix trilogy, the TRON franchise emphasizes heavily the authoritarian nature of a centralized artificial intelligence and celebrates the human-centric resilience against something that is otherwise immortal, efficient, and inorganic. In light of Flynn’s comments at the climax, the driving philosophy of the TRON universe is that in the end, imperfection is the ultimate perfection because “true” perfection is unattainable. This is the definitive philosophical (and perhaps ethical) difference between the virtual gods (users) and the virtual demigod CLU.

CLU, feeling betrayed by Kevin Flynn after the ENCOM visionary decided to take in the isomorphs (which I still don’t really understand), tries to turn the world of the Grid against the virtual gods, the users if you will, and simultaneously tries to become a user by taking Flynn’s info disc and attempting to get to the virtual-to-physical portal. As a virtual demigod, CLU simultaneously despises and desires something of the virtual gods (the users) – all stemming from the fact that CLU was born from Kevin Flynn, and feels that Flynn is an imperfect component of his core objective of creating a perfect system – a virtual utopia, so to speak.


The ever beguiling Olivia Wilde, who portrays the brave and inquisitive Quorra

Quorra, on the other hand, dedicates her purpose to helping the users, effectively asserting her loyalty to what is otherwise a 'greater’ entity in the virtual world – the users, or the true gods of the Grid. In fact, at one point Quorra tells CLU that he does not belong with the users, for she has seen and understands full well that the users’ world is beyond what CLU can even fathom. Her drive is not one of greed or ambition, but of curiosity and a desire to understand things outside her field of knowledge – she is, effectively, a virtuous virtual mortal who is granted the gift of becoming a virtual god (a user) at the very end. The program TRON, too, utters the famous line “I serve the users” at the very end, thereby cementing the higher-being significance of users inside a virtual world. In short, there’s a rather distinct hierarchy of users (virtual gods), AI/master programs like CLU and TRON (virtual demigods) and branching programs/functions (virtual mortals) that is prevalent in the TRON: Legacy universe – all of which is unfortunately not fleshed out due to a severely lacking screenplay. Moreover, by ending the movie with Quorra now a real-life human after originating from the virtual Grid – well, that begs the question of how exactly Kevin Flynn linked the physical and virtual worlds together, and what exactly Quorra’s presence in the physical world means for, well, humanity – questions that are effectively left as open-ended and frustrating as those unresolved by The Matrix Revolutions (I also have issues with what happened to TRON in the same vein, but less so than with the ending scenes with Quorra).

There is, however, an interesting contrast between the first and second films of the TRON universe. The first film was defined heavily by an anti-communist, anti-authoritarian sentiment that, by extension, was largely the sentiment of a pro-capitalist and anti-socialist American public. In the second film, Sam Flynn annually hacks into the ENCOM company; this time, he distributes the company’s OS (Encom OS 12, previously known as Flynn OS) for free on the internet, his core belief (like his father’s and Alan Bradley’s) being that software should be available to students and users alike. Perhaps it is a result of the democratizing force of the internet, but Sam’s actions are, at their core, inherently socialist: by freely distributing Encom OS 12 onto the net, Sam effectively goes against the capitalist free market – a rather striking contrast to the attitude of the first TRON, where Kevin Flynn was trying to prove a copyright violation in order to get his share of the commercial success of his programming.

(end of spoilers) 

Boiling it all down 


TRON: Legacy is a beautiful film, plain and simple. The unfortunate thing, however, is that its story is simply too weak to adequately support the astounding visuals that flash across the screen. 

Would I see it again? There’s a good chance yes; like Avatar, sometimes I can’t get enough of a film that aesthetically astounds me (and if there’s a second time, I’ll try and see it in traditional 2D to see how much brighter the colors are). 

There are serious cutting and editing flaws that result in an oddly paced film (I wish there were more light bike scenes and that the film editor had had the better sense to cut out the dialogue that felt too stiff pace-wise), and for those unfamiliar with computer science basics or even the predecessor film, TRON: Legacy can be a maddening and annoying experience to endure.

The film is, nonetheless, breathtaking in its own right. I enjoyed it for what it did right, and guessed correctly where it would falter otherwise. For those who adhere strictly to Alfred Hitchcock’s film philosophy of “story, story, story,” stay away from Kosinski’s directorial debut; for those who enjoy a visual fest, love Daft Punk, like good science fiction or just want something beautiful to see, by all means TRON: Legacy is an excellent candidate. Just don’t get me started on how it received a PG rating despite the virtual violence, virtual drinking, virtual sex symbols and virtual deaths that occur throughout the entire film – all because there wasn’t any real blood, because it was all virtual.


Japanese poster for TRON: Legacy

Recommended Reading

TRON: Legacy movie review – Emanuel Levy

TRON: Legacy movie review – Todd McCarthy

TRON movie review – Roger Ebert

TRON: Legacy movie review – Roger Ebert

Recommended Clips

Daft Punk’s “Derezzed,” special trailer presentation

Fanmade trailer of TRON: Legacy, titled “Rerezzed”

2008 Comic-Con TRON: Legacy teaser

World of TRON: Legacy featurette

TRON: Legacy – behind the scenes of the TRON vehicles

TRON: Legacy Innovative Design Featurette

Official style of TRON: Legacy Featurette

TRON: Legacy – CLU featurette

Why Rated-G isn't an excuse for poor craftsmanship


Having just re-watched Wallace & Gromit: The Curse of the Were-Rabbit, I was reminded that G-rated films and programs do not need to entail idiotic gags that cater to the stupidest of the stupid. Like Sesame Street, classic Disney and more recently Pixar, Nick Park and Aardman Animations have proven again and again that just because you’re aiming for a G rating doesn’t mean your jokes have to be any less intelligent, the scares any less horrific, or the innuendos any less implied – it’s all in good taste, of course.

Why is Wallace and Gromit such a gem of a canon? Quite simply, it makes no pretense of being anything more or less than it is – a lovely and adventure-filled universe with the ever-inventing Wallace and the master of silent acting, Gromit. As I’ve said before¹, Nick Park’s Wallace and Gromit universe is undeniably sweet and unimposing in its presentation, entirely grounded in the reality of the mundane amidst the spectacular. We see Wallace’s outrageous inventions, and he acts as if they’re perfectly normal even when they malfunction (the most he’ll do is shake his hands back and forth in front of his face and go “oh dear…!” if it gets really bad); and as per usual, good old Gromit is there to pick up the pieces after something goes haywire, or if Wallace simply wants some cheese and crackers. This is a universe untainted by true malice, the most negative of feelings arising solely from somebody wanting something rather ordinary, and in the most hilarious manner as well (Victor and his toupee? Check!)

Throughout the entire movie I was laughing hard, amazed (and still am!) by the cleverness and subtle jokes Park and his production team snuck in – jokes that both children and adults will laugh at, and for very different reasons. This is what a good movie is all about!

So why is it that so many children’s films these days seem so content to sit and stagnate in a pool of screenplay mediocrity? It’s almost as if a majority of animation studios took a cue from Disney Channel, focusing on a false sense of the extraordinary and always seeming on the verge of being prescribed Ritalin. Everyone wants to be Hannah Montana, shiny and bright and in-your-face fun! There needs to be drama, action, something to drive everyone on screen to a constant energy high, as if the world will stop spinning if they’re not displaying the effects of a caffeine overdose. No one seems to be content with the quieter aspects of childhood, the moments where we find a little caterpillar slowly walking on by and begin imagining just what it might be up to.


What makes Wallace and Gromit so exemplary is that, in addition to outright rejecting the A.D.D. mentality endemic to most recent American children’s productions, the series is inherently good-natured and calm. No one gets too excited, too sad, too angry, too – well, anything. The characters take it all in stride – from being dragged underground by a giant rabbit to two dogs having an aerial dogfight, or even the fact that everyone has over-the-top security for their precious vegetables – nothing quite seems to push anyone into the realm of clichéd or hopeless desperation: we all know it’s going to be all right in the end, just you watch! (also, could you make some tea and crackers while you’re at it?)

The narrative structure of Wallace and Gromit is one of adventure, balancing Looney Tunes cartoon spectacle with characters who at most will say “oh dear” when something goes haywire. There are feats only imaginable in the realm of animation (stop-motion animation, for that matter) that defy logic so unapologetically, without so much as a wink – well, you’ll just have to accept that Wallace insists on contraptions and machines to perform the most menial of tasks, and that two dogs will take a moment to shuffle through their coin pouches while they’re fighting over a plane ride, because at the end of the day you know they’ll all be sitting down for cheese and crackers.

The only other director-animators I know who are completely at peace with being unspectacular are Hayao Miyazaki and Isao Takahata of Studio Ghibli, and Sylvain Chomet, who made The Triplets of Belleville and the upcoming The Illusionist. There’s a distinct calmness to Miyazaki’s, Takahata’s and Chomet’s works that refuses to indulge in instant anything, and refuses even more to cue us into how we should be feeling. The musical scores are not composed to elicit a specific emotion, but beautifully supplement what happens on screen at a given time; there are times when minimalism and silence are the most important sounds for a given scene. Nick Park takes this distinct calmness a step further by framing something otherwise amazing and fantastic as something almost regular, expected and humorously mundane even. Park’s directing technique is not dissimilar to that of Joel and Ethan Coen, especially in the duo’s exemplary Fargo, in which an intense kidnapping and a series of gruesome murders tied to an extensive extortion scheme are somehow hilarious and really, really uncool (the word “yeahhh” will never be the same for me again).

Rated G doesn’t need to be dumbed down. Classic Disney films like Pinocchio and Snow White are as rated G as they come, and there are still scenes that I find traumatizing and disturbing (to date, the donkey transformation scene in Pinocchio still gives me chills)². Pixar has established itself as an institution open to challenge and change³, where the creative process is inherently tied to feedback, feedback, and more feedback. Even Sesame Street has demonstrated exemplary intelligence in broadcasting primarily to children: the PBS program has repeatedly shown itself to be up to date, intelligent, and sensible in communicating sociocultural, political, and even pop culture topics to children (the Old Spice spoof⁴ is one of my favorite parodies to date).

I can only hope that the sugar-high mentality of many recent children’s films will have at least diluted a bit by the time I end up raising a kid, and hope even more that this stupid 3D enthusiasm will have kicked the bucket. We need smarter, better writers for children, writers who are sensitive enough to know what strikes a chord in both kids and adults alike, and timelessly so. In television, we’ve had Sesame Street, Animaniacs, Foster’s Home for Imaginary Friends (Craig McCracken), and old SpongeBob SquarePants (when Stephen Hillenburg was still involved); in film, we’ve got classic Disney, Pixar, Sylvain Chomet, Hayao Miyazaki, Isao Takahata, Henry Selick and Nick Park; in books, we’ve got Lewis Carroll, Beatrix Potter, J.M. Barrie, Dr. Seuss, Roald Dahl, Beverly Cleary, Avi, and now J.K. Rowling – so, anyone else up for the challenge?


Referenced and Recommended Reading/Links

¹The Simple Sweetness and Sincerity of Wallace and Gromit

²The Mythology of Classic Disney

³”It Gets Better,” Love Pixar

⁴Smell Like a Monster

• Wallace and Gromit and the Curse of the Were-Rabbit – review by Roger Ebert

• Cookie Monster audition tape for SNL

• Ricky Gervais and Elmo – off camera bantering

The Kid’s are All Right: Ramona and Beezus – by Dennis Cozzalio

Be Human

To be real is to be mortal; to be human is to love, to dream and to perish.

- A. O. Scott

There’s been a lot of buzz on the internet lately, post-The Social Network, about whether or not the internet is a bane to our humanity. Enthusiasts say it allows connection beyond physical limits, and that it is democracy in the dark; detractors say it allows us to release innate bestial behavior that we’d otherwise control in the physical world, and that it’s an endless sea of voices dumbing down one another.

After some time mulling it over, I realized that almost no argument defines one very important term – what is humanity, and what is it to be human?

This thought popped up after reading Richard Brody’s counterargument¹ to Zadie Smith’s Generation Why?² piece. Having read Smith before Brody, I could see why he takes so many issues with her seeming diatribe. She writes:

How long is a generation these days? I must be in Mark Zuckerberg’s generation—there are only nine years between us—but somehow it doesn’t feel that way… I often worry that my idea of personhood is nostalgic, irrational, inaccurate. Perhaps Generation Facebook have built their virtual mansions in good faith, in order to house the People 2.0 they genuinely are, and if I feel uncomfortable within them it is because I am stuck at Person 1.0.

Brody then lambasts Smith on this angle, saying: 

The distinction between “People 1.0” and “People 2.0” is a menacingly ironic metaphor: following Smith’s suggestion that those in the Facebook generation have lost something essential to humanity, it turns “People 2.0” into, in effect, “People 0.0”—inhuman, subhuman, nonhuman. That judgment is not merely condescending and hostile, it’s indecent.

On this level I couldn’t agree more with Brody, not only because the 1.0 vs. 2.0 distinction is arrogant and subjective – where’s the cutoff to begin with? – but because Smith falls on her own sword by failing to define what she considers human in the first place. To be fair, neither does Brody, but his response is a critical one, and rightfully so.

Perhaps Smith didn’t intend for the 1.0/2.0 distinction to be interpreted this way; maybe she simply meant our brains are wired differently. According to Nicholas Carr’s The Internet Makes Deep Thought Difficult, if Not Impossible³, 

The Net delivers precisely the sensory and cognitive stimuli – repetitive, intensive, interactive, addictive – that result in strong and rapid alterations in the brain cell

Regardless, the very notion that somehow the internet age generation is “less human” because of how we operate or even how our brains are wired is a condescending one, conservative even. It implies some sort of overarching normality and human ideal that, frankly, is not universally applicable to how people operate on a day-to-day basis. To claim that those engaging on the net are “less human” implies that we cannot exist as distinct entities if we are not tied to our physical forms, period. 

I’ve written before⁴ about how much I disagree with this assessment, primarily because the body-existence relationship theory implies you somehow become “less human” if certain bodily functions cease to exist (e.g. being unable to eat/drink/talk, loss of appendages). An existence is influenced and defined to an extent by its environment, but it has been proven over and over again that this existence can be extended beyond the shell of the body it occupies. The very act of writing anything down is already an extension of someone’s existence: books were one of the earliest gateways to an alternative reality, and television, film, and other media followed suit. The internet is no different.

To be fair on Smith’s end, she makes a good point about how the internet has become less personal in some respects: 

When a human being becomes a set of data on a website like Facebook, he or she is reduced. Everything shrinks. Individual character. Friendship. Language. Sensibility.

Brody rightfully criticizes her for this very phrase, but let’s put her rather negative parsing into this context: remember the earlier days, when everyone was sort of building their own websites, doing their own kind of online journaling with whatever they could? Before Wikipedia, YouTube, hell, even Google – we were all trying to build our own realm, experimenting with different media and the limits of the internet at the time. I remember my friend Dominique once showed me some clips of Pokemon she’d managed to upload onto her dot-com site, and how much that impressed me; over ten years later and we’re posting minutes of clips on YouTube, no problem. I used Geocities (RIP) to create an online portfolio of writing, drawings, and even a journal; now there’s flickr, tumblr, more platforms than I could’ve ever imagined. There are now distinct hubs of websites, many long established during the dot-com bubble – Amazon for shopping, Photobucket for photos, and Google for pretty much everything these days.

There is of course some weight to what Smith says here: there have been multiple psychological tests and studies done to see how much empathy people can show for one another depending on the degree of separation. For instance, in light of the Holocaust, the elephant-in-the-room question was this – how could ordinary people enlisted in the Nazi regime even perform such atrocities? It turns out that multiple things were at play: primarily, the threat of death and harm to their families kept many members in check, regardless of their personal thoughts and philosophies on the idea of subjugating Jews to such depravity and genocide; it also turned out that simply by not seeing the person they are about to harm or even kill, people are more likely to follow orders (i.e. “the person in the other room is a prime subject, so you need to keep pushing the electrify button every five minutes or every time they deny they are a liar”) because of the physical disconnect from the person they are engaging with. This explains why we get so many trolls and inflammatory remarks on the internet, a sorry symptom of a medium that allows us to be physically disconnected from actually seeing the person we may be engaging with. If this is the angle Smith meant when she said “we are reduced,” then yes – there is the potential for us to be less civil and collected given the chance to simply detach ourselves from physical sociocultural restraints. However, Smith goes on to say that the very existence of the net makes us less human in terms of individual character – all of which I so fundamentally disagree with because, like Brody, I find it arrogant and hostile. It’s one thing to say a degree of separation enables us to be more belligerent and in poor taste (even bestial, with the recent surge of cyberbullying), but it is another to claim that no good can come of the medium either, and that to subscribe to the net is to subscribe to an existence of hollow emptiness.

It’s true that institutionalizing websites has made the internet less personalized, and that to subscribe to such a categorical organization is to give up certain programming and web design idiosyncrasies otherwise expressed outside the set parameters of a search engine, mail client, or social networking tool. Smith’s wording, however, is much too heavy for an assertion without a real foundation: we’re still ourselves on the net, and while we may share similarities with one another, these similarities do not detract from who we are as individuals. Standardization, albeit gray-scaling things down a bit, allows us to connect more easily to others, and then it’s possible to use such a connection as a segue to another piece of the internet that is perhaps more personal to ourselves, and ourselves alone.

A perfect analogy for how the internet has transitioned is the radio. When it began, everyone was out for themselves: you played with radio waves, made your own messages, relayed back and forth between different frequencies – it was a free-for-all. However, as broadcasters became established and distinct stations fell into place, this free-for-all diminished and everyone began subscribing to the programs of their liking.

If we were to go by Smith’s standards, the radio equally made everyone “less human” than their predecessors once stations’ broadcasts came to dominate over people playing around with frequencies and wavelengths. With this additional perspective, I couldn’t agree more with Brody’s response:

Smith’s piece is just another riff on the newly resurgent “what’s wrong with kids” meme that surges up from the recently young who, in the face of rapid change, may suddenly feel old… Democratization of culture implies democratization of influence. Biological age, years of study, and formal markers of achievement aren’t badges of authority any more; they haven’t been for a long time. In school, in universities, they do remain so, and Facebook arose in response to, even in rebellion against, the social structures of university life that were invented for students by their elders the administrators on the basis of generations’ worth of sedimented tradition. Who can blame Zuckerberg for trying to reorganize university life on his own terms—and who can blame those many young people, at Harvard and elsewhere, who found that his perspective matched, to some significant extent, their own world view and desires? And if Facebook also caught on even among us elders, maybe it’s because—in the wake of the exertions of the late-sixties generation—in some way we are now, today, all somehow still in a healthy state of struggle with institutions and traditions.

No matter how you look at it, the internet truly is democracy in the dark. Democracy does not necessarily mean people do or believe in the positive – Apartheid is a perfect example of democracy in the negative – but it is simply an avenue for voices to be heard, and for the truly strong, competent and remarkable individuals to shine even brighter amongst the screams and laughter of the net. To claim that the internet somehow makes us dumber⁵ ignores the very external institutions that have collapsed so badly (the American education system is riddled with so many problems that I’m still astounded the federal government insists upon school curricula determined by the states), and ignores the fact that the internet, no matter how you look at it, is an extreme representation of our very selves, which have inherently been shaped by the very institutions and policies formed in the non-virtual world. To claim also that media somehow make us “less human” (a claim made famously by Chuck Klosterman in this interview⁶) is, as Brody says, incredibly inappropriate and condescending, and again ignores a grave, fundamental question – what does it mean to be human in the first place?

Lastly, to add my two cents: not once does she directly define what it is to be “human,” claiming only that people 1.0, “real people,” are simply not people 2.0, “the Zuckerberg generation.” This is where and why Smith’s argument (and rant, for that matter) falls apart from the very beginning, all because she chooses to define terms in opposition rather than in definite terms. Defining in opposition is all relative: if you say “I’m against those who are sexually depraved,” you’re not defining what sexual depravity is, merely implying some subjective grouping of those you consider “sexually depraved”; conversely, if you directly say “I’m against the philosophy of free love because I believe in monogamy,” the terms of “sexual depravity” are more clearly defined, and while these parameters are subjective there is at least a constant the argument can fall back on.

So to you, Professor Smith: to be human is to have an irrational desire to love and be loved, regardless of what these emotional, illogical actions may entail for our physical well-being. There is a universal sense of our mortality and of the gateway to death that awaits us at the undefined chronological end. So no matter how much the world changes, what new innovations happen to come our way, or even how our behaviors may fluctuate with pulsating sociocultural upheavals and revisions, humanity will always be present so long as we have a sense of illogic in dealing with the world around us. I am not a 2.0 person just as you are not a 1.0 person: we are both in the same game, only I do not crave the nostalgia and the older institutions you are more familiar and comfortable with. Let us all agree that change is one of the few constants in life, alongside life and death.

Irrational love defeats life’s programming. 

– Andrew Stanton


Referenced Reading: 

¹Plus on Change – by Richard Brody

²Generation Why? – by Zadie Smith

³The Internet Makes Deep Thought Difficult, if Not Impossible – by Nicholas Carr, published in the magazine Discover Presents: The Brain, Fall 2010

⁴Ghosting

⁵Dumb and Getting Dumber – by Janice Kennedy

⁶The Chuck Klosterman Interview Part 2 – conducted by Hunter Stephenson of /Film