neurobiology

Alzheimer's and Algernon

If there were one disease that was truly terrifying from a philosophical perspective, Alzheimer’s would be an indisputable contender. Neurodegenerative, the disease slowly but surely robs victims of their dignity, destroying the very essence that years of invaluable experience culminated in. 

Philosophy takes for granted that the human condition cannot simply come crashing down: presumably our minds continue to pulsate and change, and these pulsations result in our very being – a unique entity within the blistering swarm of the universe. Philosophers have taken for granted that this uniqueness could not somehow be degraded from its essence, that it was an all-or-nothing relationship: existence or death. Yet Alzheimer’s has proven otherwise, manifesting as one of the slowest and cruelest forms of degeneration. Most other diseases eat away at your body, some perhaps resulting in insanity or dementia; Alzheimer’s, however, does not simply destroy you physically; it also sheds away your personality and mind until nothing is left but sensationless pulp. 

There have been books written and films made about Alzheimer’s – perhaps the most famous and critically acclaimed is Away from Her – but I have always been curious about the sort of despair and devastation a patient feels when the very characteristics that defined them become impossible to act out. Most of the time the patient is unaware of their own degeneration; some know their predicament, and that it is only a matter of time before their own sense of being is no more. I know the families of patients perhaps suffer more than the patients themselves (especially since, in the worst stages, the patients are blissfully ignorant of their own state of being), but I do wonder: what on earth is it like to suddenly realize you’re falling down, down and down from a pillar you once proudly stood upon, the pillar which effectively defined your own existence? 

Such is a question beautifully and poignantly explored in the novel Flowers for Algernon. The premise is this: a mouse, called Algernon, successfully undergoes experimental surgery to artificially improve its intelligence. Charlie, a mentally disabled man, volunteers for the same treatment in hopes of becoming more intelligent. Charlie’s treatment is also successful, and his intelligence grows exponentially to the point of outclassing the finest minds in the world. However, Algernon begins deteriorating, and very soon it’s obvious that Charlie will meet the same fate. 

The book is exceptional not only because of the moral and ethical dilemmas it presents, such as the treatment of the mentally disabled or the way academia operates, but especially because of how it is written and presented. The book is structured as a journal, supposedly maintained by Charlie from when he first opts for the treatment, through the height of his fantastic intellect, to the beginnings of his decline, where he eventually stops writing because he is afraid and devastated by the idea of documenting his deterioration any further. 

I can only imagine that Charlie’s fall from intellectual greatness is perhaps analogous to the experience of an Alzheimer’s patient. For the first time in his life, Charlie exudes an intellect so utterly spectacular, so magnificent, that he truly feels a sense of pride in himself. However, the side effects of the experiment kick in, and much like the plaques and dying neurons of Alzheimer’s, Charlie finds himself losing more and more of himself every day; more horrifying is the prospect of falling from greatness back into a state even more mentally handicapped than before, and possibly brain death for that matter. These last journal entries are devastating: the anger, the despair, the desperation, and finally the acceptance – there’s almost a cruel irony that the greatest genius on earth should perish as a vegetable. Understandably, Charlie abandons his journal before he begins documenting himself in a more degenerated state. This last move is Charlie’s desperate effort to maintain himself somehow, to leave a record chronicling that the intelligent Charlie did exist, written while his now handicapped self can still remember him; further recording would only show how this Charlie was replaced by another Charlie, and another, and so on. Denial? Perhaps, but in his situation wouldn’t you opt for the same thing? 

Perhaps a more interesting question to consider is this: is there a certain point where our physical or mental degeneration effectively makes us a completely different person than we were before? That is, is our existence all-or-nothing, or a gradient? With Alzheimer’s, I feel that the all-or-nothing model fails to account for the nuances and accumulating changes an afflicted individual encounters; many of the earliest symptoms are mistakenly attributed to ordinary senility, but as the symptoms become more and more frequent it becomes obvious that something is amiss – and that’s when the diagnosis comes in (interestingly, doctors can only definitively confirm Alzheimer’s by autopsy). By the time a diagnosis is made, it’s only a matter of time until the person you know and love is no longer there, effectively snuffed out of their existential essence and ghost. 

The analogy between Flowers for Algernon and Alzheimer’s is nothing more than my own projection. I’ve never had relatives or friends afflicted with the disease; the closest experience I had with Alzheimer’s was back in July 2009, when I volunteered at an Alzheimer’s clinic, where I simply kept patients company and interacted with them so they wouldn’t be left all day with nothing but television to watch. Yet I still wonder about the sort of distress (or lack thereof) one feels when, slowly but surely, they become less and less themselves. 

I recently read a Time magazine article titled “Alzheimer’s Unlocked.” Detailing current developments and advances against the disease, the article optimistically stated that with recent medical imaging techniques like advanced MRI machines, doctors and researchers can now visualize anatomy and physiological pathways in the brain that were previously out of the question with traditional dissection techniques. The biggest hope is that more avenues of research will open up, and that we can now really see what else we might have missed in researching the disease: traditionally, many believed that plaque formation corresponded to Alzheimer’s development, but whether that relation is direct or indirect, or whether another (or several other) physiological mechanisms are at work, is the more recent question at hand. 

Surprisingly, the article did not address why Alzheimer’s occurs in the first place, beyond the physical fact that some people are genetically predisposed to it. I wonder, though, if the disease itself is perhaps a natural, inherited mechanism to shut down the human body when it begins to seem that our physical forms are no longer reproductively viable or energetically sustainable – possibly, Alzheimer’s is a way of slowly shutting down a physical system that is simply too old. 

This is all theory, of course. The only basis for it is that Alzheimer’s is considered a disease of the elderly, while other diseases like Parkinson’s, tuberculosis and cancer can afflict anyone at any point in life, and afflictions present at birth, such as Down syndrome, are the effects of genetics seen immediately. Perhaps Alzheimer’s is just a genetic affliction triggered by the mere physical state of being elderly, and if aspects of the environment induce chronic stress (a constant firing of the sympathetic nervous system with little chance for the parasympathetic nervous system to balance it out) that accelerates the aging process, Alzheimer’s manifests as a way to simply shut down the now overworked body. 

We may not know for many more years, or just as easily a new finding could reinforce or completely disprove what I’ve just laid out here. However, I’m sure we can all agree on one thing – that Alzheimer’s unequivocally destroys any sense of being we might have of ourselves, ghost and all. 

Recommended Reading

Flowers for Algernon – Daniel Keyes

Away from Her – Roger Ebert movie review

Charly – Roger Ebert movie review

Away from Her – A.O. Scott movie review

New Research on Understanding Alzheimer’s – Alice Park of Time Magazine

The Terrible Power of Memory Manipulation

Of all the powers one could hold over another human being, memory manipulation stands alone as the deadliest and most deeply personal one. 

Many famous serials and stories use memory as a premise for their plots and characters, which is unsurprising: it immediately creates a plot device of mystery and intrigue surrounding the character – whether internal or external – and for the rest of the story we want to see why, what, and how it all happened. The genesis, the origin, the forbidden fruit: we’re hooked on piecing together what exactly is going on. 

Memory manipulation, of course, occurs to varying degrees. There’s the traditional all-out amnesia, which can be traced all the way back to old folk and fairy lore, like the princes who forget their true beloved in stories collected by the Brothers Grimm; there are the classic science fiction elements of wiping out a personality and replacing it with another, as seen in Total Recall starring Arnold Schwarzenegger (“If I am not me, then who the hell am I?!”); the selective erasure of a portion of memory, which drives Jack Harkness to find out why a previous organization did it to him; residual memories passed on to the next generation by unique means, as in The Giver; stories that deal with real-world medical issues of neurodegenerative disorders, such as Alzheimer’s in Away from Her; and then the murky past that serves both as a beacon of light and as a haunting, driving obsession to either resolve or run away from, seen all too often in serials like Jason Bourne. 

Why are memory narratives so engaging? Foremost, they are psychological: more important than the physical actions at hand are the underlying pulsations of nerve signals, the undulating nuances of electrochemical messages spiking back and forth along the dendrites, somas and axons of neurons; yet beyond these neurobiological bounds there is something more that science has yet (if ever) to fully encompass and objectify – what exactly composes the arena of irrational emotions, the enigma of psychology. 

We can never be sure our memory is 100% accurate. Details are lost, omissions are consciously or subconsciously made, facts vary slightly, and retellings and subsequent recountings dilute the actual event more and more: it’s a very fickle component of our cognitive existence – essential, but fickle. Evolutionarily, memory serves as a compilation of the survival and social skills needed to get by – instinctual muscle memory and habitual memory, you could say. And while humans have evolved to exist on secondary resources (e.g. money) as a means to indirectly survive off primary resources (e.g. soil, water), thus leading to cultural and infrastructural development as we know it, memory still plays an integral part in our daily lives. Whether it’s habitually checking your car mirrors, playing a piano piece without sheet music or even balancing on your bike – memory is all over the place, instinctually and habitually so. 

Memory is more nuanced than instinctual and habitual tendencies, though. There are instances we remember for various reasons. From what your boss told you this morning (“Do you still have my stapler?”) to how beautiful your spouse looked on the eve of your honeymoon, what we choose to remember is essential to how our lives function, and invariably these memories – regardless of intention or purpose – are driven by deeply personal reasons. After a while, the truest aspect of memories is the emotion associated with them – emotions of love, hate, happiness, pain, joy, sorrow, wonder, trauma, all of it. In this sense, memories lie incredibly raw in the undulating, hidden reservoirs of our cognitive consciousness. This is why memory manipulation is such an engaging and haunting narrative premise to play off of. 

Take for example Steven Spielberg’s Minority Report. John Anderton, upon discovering a glitch in the pre-cog system, soon learns that he will kill a man he has never met. Despite great efforts to evade a highly computerized (optimistically futuristic) society, Anderton eventually finds the room of the alleged victim, Leo Crow. 

This scene is crucial: Anderton sees the man’s bed covered with pictures of children, one of which shows the man with Anderton’s missing (and likely deceased) son, Sean. Upon seeing these photos Anderton completely breaks down, and when Crow returns to the room, Anderton rushes at him and brutally beats him down in a fiery rage. 

What’s noticeable about this climactic scene is that up until now, Anderton has methodically and coolly followed clues to whoever could possibly be framing him. Yes, there are moments of action, but nothing compared to the almost bestial fury he unleashes when he’s led to believe Crow is the pedophile who kidnapped (and probably killed) his only son years ago. And despite the orgy of evidence strewn about the scene (as Danny Witwer later determines, “this [was] a setup”), Anderton lets go of logic to act upon his primal emotions of anger and pain, to unleash upon this alleged man all the emotional scars that never found solace or closure in all these years. Haunted by guilt and long, long episodes of desperation and disillusionment, Anderton holds his memory of Sean so close to his heart that in a moment of weakness, he loses his cool and nearly fulfills his own predicted destiny. 

Anderton’s reaction results from an external and intricate manipulation of a memory that is deeply personal and painful. It’s a very low blow, but considering the perpetrator accomplished other personal goals both before and after, it’s no surprise they used such effective and cruel psychological puppeteering. Anderton’s emotional response is powerful not because of what happened, but because it is raw and unrefined beyond any measure of objectivity. 

While Minority Report explored the consequences of a memory past, Christopher Nolan’s breakthrough Memento explored individual fragments of memory leading up to the final consequence. 

Structurally, Memento is one of a kind: it begins with a man being shot, then progresses backwards (and eventually forwards) to where it all began, where we the viewers finally piece together what has happened to Leonard Shelby, who suffers from anterograde amnesia (the inability to form new memories after the event that caused the condition, typically due to damage to the hippocampus or surrounding cortices). 

Shelby is haunted by memories of his wife, who he believes was killed by the same burglars who attacked him and left him with his condition. He copes by taking pictures of people and scenery, then writing himself notes about what he feels or knows about the subjects in the moment, while he can still recall those feelings or that knowledge; that way, he hopes, he can progress forward and not start over again from scratch. Lastly, the most important information that he can absolutely never, ever forget – he tattoos it on himself. Foolproof, right?

Wrong. As we can see from an outsider’s objective lens, the people around him manipulate Shelby’s artificial memory mechanism for their own exploits: Shelby’s landlord charges him for two rooms while he only occupies one; Natalie uses him to drive a man named Dodd out of town; and Teddy uses Shelby’s investigative vigor to track down his own set of criminals (or so he says). 

There’s a lot of discussion about the fabula and sujet of Memento, but for the purposes of this piece I won’t go into them here. Instead, I believe the implications of Nolan’s breakthrough indie speak volumes for a few reasons: 

As I’ve said before, memory is not 100% valid: it is a figment of collected information stored in our brains, clipped and edited to our needs and liking. Even if we document the actual events via writing, photography or film, there is always the question of whose perspective these forms of documentation come from, and whether or not they capture enough of the whole event to merit factual validity. We don’t necessarily need these kinds of documents to remember an event, but they very much help us remember certain details and emotions that would otherwise be lost to the crevices of cognition. In Shelby’s case, he takes Polaroid pictures because he needs to write down his thought process immediately: he relies on an artificial means of memory building, and though it is quick, it is nowhere near the processing speed of the human brain. This limited time frame is just enough for people to take advantage of him for their own needs – and by extension, this time frame is all it takes for something to seep in and “tamper with” our own documenting process. 

Minority Report masterfully combines film noir aesthetics with chic science fiction elements, highlighting a very classic form of memory manipulation – the haunting effect that drives the protagonist to act the way they do, an echo from the past leading to the ultimate conclusion. Memento, at the polar opposite, is a generously unconventional film that explores memory manipulation the other way around: we know the conclusion but not the beginning, the echo from the past. Both films were released around the same time (Nolan’s in 2000 and Spielberg’s in 2002), so it’s especially interesting to see how two films that explore memory resonate with and diverge from one another so much. 

Memory is a very intricate arena of the mind, and any tampering with it invariably violates our own identity. At its core, memory manipulation is incredibly intriguing, terrifying, and deeply emotional – enough to make it a terrible power to have over another. 

Additional Recommended Reading

Is There a Minority Report? (or What is Subjectivity?) – by Matthew Sharpe, PhD in Philosophy from the University of Melbourne.