Be Human
To be real is to be mortal; to be human is to love, to dream and to perish.
– A. O. Scott
There’s been a lot of buzz on the internet lately, post-The Social Network, about whether or not the internet is a bane to our humanity. Enthusiasts say it allows connection beyond physical limits, and that it is democracy in the dark; detractors say it allows us to release innate bestial behavior that we’d otherwise control in the physical world, and that it’s an endless sea of voices dumbing one another down.
After mulling it over for some time, I realized almost no argument defined one very important term – what is humanity, and what is it to be human?
This thought popped up after reading Richard Brody’s counterargument¹ to Zadie Smith’s Generation Why?² piece. Having read Smith before Brody, I could see why he takes so much issue with her seeming diatribe. She writes:
How long is a generation these days? I must be in Mark Zuckerberg’s generation—there are only nine years between us—but somehow it doesn’t feel that way… I often worry that my idea of personhood is nostalgic, irrational, inaccurate. Perhaps Generation Facebook have built their virtual mansions in good faith, in order to house the People 2.0 they genuinely are, and if I feel uncomfortable within them it is because I am stuck at Person 1.0.
Brody then lambasts Smith on this angle, saying:
The distinction between “People 1.0” and “People 2.0” is a menacingly ironic metaphor: following Smith’s suggestion that those in the Facebook generation have lost something essential to humanity, it turns “People 2.0” into, in effect, “People 0.0”—inhuman, subhuman, nonhuman. That judgment is not merely condescending and hostile, it’s indecent.
On this level I couldn’t agree more with Brody, not only because the 1.0 vs. 2.0 distinction is arrogant and subjective – where’s the cutoff to begin with? – but also because Smith falls on her own sword by failing to define what she considers human in the first place. To be fair, neither has Brody, but his response is a critical one, and rightfully so.
Perhaps Smith didn’t intend for the 1.0/2.0 distinction to be interpreted this way; maybe she simply meant our brains are wired differently. According to Nicholas Carr’s The Internet Makes Deep Thought Difficult, if Not Impossible³,
The Net delivers precisely the sensory and cognitive stimuli – repetitive, intensive, interactive, addictive – that result in strong and rapid alterations in the brain cell.
Regardless, the very notion that the internet-age generation is somehow “less human” because of how we operate, or even how our brains are wired, is a condescending one, conservative even. It implies some sort of overarching normality and human ideal that, frankly, is not universally applicable to how people operate on a day-to-day basis. To claim that those engaging on the net are “less human” implies that we cannot exist as distinct entities if we are not tied to our physical forms, period.
I’ve written before⁴ about how much I disagree with this assessment, primarily because the body-existence relationship theory implies you somehow become “less human” if certain bodily functions cease to exist (e.g. being unable to eat, drink, or talk, or the loss of appendages). An existence is influenced and defined to an extent by its environment, but it has been shown over and over again that this existence can extend beyond the shell of the body it occupies. The very act of writing anything down is already an extension of someone’s existence: books were one of the earliest gateways to an alternative reality, and television, film, and other media followed suit. The internet is no different.
To be fair to Smith, she makes a good point about how the internet has become less personal in some respects:
When a human being becomes a set of data on a website like Facebook, he or she is reduced. Everything shrinks. Individual character. Friendship. Language. Sensibility.
Brody rightfully criticizes her for this very phrase, but let’s put her rather negative parsing into context: remember the earlier days, when everyone was building their own websites, journaling online via whatever tools they could find? Before Wikipedia, YouTube, hell, even Google – we were all trying to build our own realm, experimenting with different media and the limits of the internet at the time. I remember my friend Dominique once showed me some clips of Pokémon she’d managed to upload onto her dot-com site, and how much that impressed me; over ten years later, we’re posting minutes of clips on YouTube, no problem. I used GeoCities (RIP) to create an online portfolio of writing, drawings, and even a journal; now there’s Flickr, Tumblr, more platforms than I could’ve ever imagined. There are now distinct hubs of websites, many long established during the dot-com bubble – Amazon for shopping, Photobucket for photos, and Google for pretty much everything these days.
There is of course some weight to what Smith says here: multiple psychological studies have examined how much empathy people can show for one another depending on the degree of separation. For instance, in light of the Holocaust, the elephant-in-the-room question was this – how could ordinary people enlisted in the Nazi regime perform such atrocities? It turns out that multiple things were at play: primarily, the threat of death and harm to their families kept many members in check, regardless of their personal thoughts and philosophies on subjugating Jews to such depravity and genocide; it also turned out that simply by not seeing the person you are about to harm or even kill – as in Milgram’s obedience experiments, where subjects were told, in effect, “the person in the other room answered wrongly, so you need to press the shock button again” – people are more likely to follow orders, because of the physical disconnect from the person they are engaging with.

This explains why we get so many trolls and inflammatory remarks on the internet – a sorry symptom of a medium that allows us to be physically disconnected from ever seeing the person we may be engaging with. If this is the angle Smith meant when she said “we are reduced,” then yes – there is the potential for us to be less civil and collected given the chance to detach ourselves from physical sociocultural restraints. However, Smith goes on to say that the very existence of the net makes us less human in terms of individual character – all of which I fundamentally disagree with because, like Brody, I find it arrogant and hostile. It’s one thing to say a degree of separation enables us to be more belligerent and in poor taste (even bestial, given the recent surge of cyberbullying), but it is another to claim no good can come of the medium either, and that to subscribe to the net is to subscribe to an existence of hollow emptiness.
It’s true that the institutionalization of websites has made the internet less personalized, and that subscribing to such categorical organization means giving up certain programming and web-design idiosyncrasies otherwise expressed outside the determined parameters of a search engine, mail client, or social networking tool. Smith’s wording, however, is much too heavy for an assertion without a real foundation: we’re still ourselves on the net, and while we may share similarities with one another, these similarities do not detract from who we are as individuals. Standardization, albeit gray-scaling things down a bit, allows us to connect more easily with others, and such a connection can then serve as a segue to another piece of the internet that is perhaps more personal to ourselves, and ourselves alone.
A perfect analogy for how the internet has transitioned is the radio. When it began, it was everyone for themselves: you played with radio waves, made your own messages, relayed back and forth between different frequencies – it was a free-for-all. However, as broadcasters became established and distinct stations took their place, this free-for-all diminished and everyone began subscribing to the programs of their liking.
If we were to go by Smith’s standards, the radio equally made everyone “less human” than their predecessors once stations’ broadcasts came to dominate and people stopped playing around with frequencies and wavelengths. With this additional perspective, I couldn’t agree more with Brody’s response:
Smith’s piece is just another riff on the newly resurgent “what’s wrong with kids” meme that surges up from the recently young who, in the face of rapid change, may suddenly feel old… Democratization of culture implies democratization of influence. Biological age, years of study, and formal markers of achievement aren’t badges of authority any more; they haven’t been for a long time. In school, in universities, they do remain so, and Facebook arose in response to, even in rebellion against, the social structures of university life that were invented for students by their elders the administrators on the basis of generations’ worth of sedimented tradition. Who can blame Zuckerberg for trying to reorganize university life on his own terms—and who can blame those many young people, at Harvard and elsewhere, who found that his perspective matched, to some significant extent, their own world view and desires? And if Facebook also caught on even among us elders, maybe it’s because—in the wake of the exertions of the late-sixties generation—in some way we are now, today, all somehow still in a healthy state of struggle with institutions and traditions.
No matter how you look at it, the internet truly is democracy in the dark. Democracy does not necessarily mean people do or believe in the positive – Apartheid is a perfect example of democracy in the negative – but it is simply an avenue for voices to be heard, and for the truly strong, competent, and remarkable individuals to shine even brighter amongst the screams and laughter of the net. To claim that the internet somehow makes us dumber⁵ ignores the very external institutions that have collapsed so badly (the American education system is riddled with so many problems that I’m still astounded the federal government insists upon school curricula determined by the states), and ignores the fact that the internet, no matter how you look at it, is an extreme representation of our very selves, selves that have inherently been shaped by the institutions and policies formed in the non-virtual world. To claim that media somehow makes us “less human” (a claim made famously by Chuck Klosterman in this interview⁶) is, as Brody says, incredibly inappropriate and condescending, and again ignores a grave, fundamental question – what does it mean to be human in the first place?
Lastly, to add my two cents: not once does Smith directly define what it is to be “human,” claiming only that people 1.0, “real people,” are simply not people 2.0, “the Zuckerberg generation.” This is where and why Smith’s argument (and rant, for that matter) falls apart from the very beginning: she chooses to define terms in opposition rather than on their own terms. Defining in opposition is all relative: if you say “I’m against those who are sexually depraved,” you’re not defining what sexual depravity is, merely implying some subjective grouping of those you consider “sexually depraved”; conversely, if you directly say “I’m against the philosophy of free love because I believe in monogamy,” the terms of “sexual depravity” are more clearly defined, and while these parameters are subjective, there is at least a constant the argument can fall back on.
So to you, Professor Smith: to be human is to have the irrational desire to love and be loved, regardless of what these emotional, illogical actions may entail for our physical well-being. There is a universal sense of our mortality, of the gateway to death that awaits us at the undefined chronological end. So no matter how much the world changes, what new innovations come our way, or how our behaviors fluctuate with pulsating sociocultural upheavals and revisions, humanity will always be present so long as we retain a sense of illogic in dealing with the world around us. I am not a 2.0 person, just as you are not a 1.0 person: we are both in the same game, only I do not crave the nostalgia and older institutions you are more familiar and comfortable with. Let us all agree that change is one of the few constants in life, alongside life and death.
Irrational love defeats life’s programming.
– Andrew Stanton
Referenced Reading:
¹Plus on Change – by Richard Brody
²Generation Why? – by Zadie Smith
³The Internet Makes Deep Thought Difficult, if Not Impossible – by Nicholas Carr, published in Discover Presents: The Brain, Fall 2010
⁵Dumb and Getting Dumber – by Janice Kennedy
⁶The Chuck Klosterman Interview Part 2 – conducted by Hunter Stephenson of /Film