Tuesday, June 28, 2011

Limitless: How can we unlock the full potential of the brain?



The trailer for the movie "Limitless" depicts a young writer who starts taking a drug that his drug dealer tells him will unlock the other 80% of his brain. The character, portrayed by Bradley Cooper, becomes fantastically intelligent and quickly becomes wealthy and famous. Could this happen? I'm not so sure. A number of smart drugs, memory enhancers and "nootropic" supplements have been shown to slightly improve memory, motivation, attention and concentration. Vitamins B and C, folate, omega-3 fatty acids, Ginkgo biloba, other herbals and proprietary blends of these like "Focus Factor" seem to help cognitive function. Central nervous system stimulants like caffeine, Adderall and Ritalin are more powerful and have proven effects on mental ability, but it is not completely clear how any of these increase mental throughput. Some increase the effects of certain neurochemicals, some dilate blood vessels, some stimulate nerve growth, and most cause neurons to work harder. Simply put, most nootropic drugs give the brain more energy to power its searches for appropriate memories. Does this mean that the brain is being used more, or that more of the brain is being used?

Is it true that humans only use 20% of their brain? No one can be sure, because there is actually no way of knowing quite what this statement is trying to say. It is clearly not true that large portions of our cortex remain unused. If you watch the video results of an fMRI scan, you will see that as a person does various tasks and is exposed to various stimuli, virtually the entire cortical surface will light up at one time or another. It is true that at any one time, only a small fraction of the neurons in the brain are being used maximally. But this is a good thing, because if the currently unused neurons were also active, they would insert inappropriate and superfluous content that would be distracting and would take away from the specificity of thought.

It is helpful that we only use a limited percentage of our brains at a time. The brain has dozens of dedicated processing areas, each fine-tuned to solve particular problems, most of which would be irrelevant to any specific task at hand. In fact, much of the brain is designed around quieting or inhibiting extraneous stimuli so that only the most pertinent things can make it into consciousness. Your mind is able to do its work without distractions because your brain is constantly suppressing activation in areas that may seem to hold pertinent memories, but have proven in the past to be misleading. Arguing that we would be better off using 100% of our brain is like arguing that it is better to use every tool in a Swiss Army knife at the same time, on the same project.

Brain researchers like Karl Lashley in the 1950s were confused by certain experimental results and were falsely led to assume that the entire cortex of the brain was undifferentiated and that individual areas do not specialize in specific tasks or activities. Outdated principles, such as "mass action" and "equipotentiality," conceptualized the cortex as a homogeneous pool of neurons where function could not be localized. Lashley and other neuroscientists of his day saw the individual tools of the brain's Swiss Army knife as interchangeable and able to be summed together. These conceptualizations were wrong but probably contributed greatly to the 10% myth. In fact, it would follow logically from his principles that if specific brain areas do not have particular jobs and if we only use a small minority of brain cells at any time, then we could increase mental ability simply by increasing the number of active areas. Too bad it isn't that easy.

Activating inappropriate memories will not increase intelligence, but what about increasing the span of activity of the most relevant memories? As we think, we hold and let go of certain memories. If we could increase the time that certain helpful memories are activated and available to working memory, then this, I believe, would increase intelligence. The brain area to target in order to do this would be the prefrontal cortex. I think that the movie "Limitless" does an amazing job of portraying the effects of prolonged prefrontal activation, or "hyperfrontality."

The best kind of nootropic drug that I can imagine would increase activity in the lateral prefrontal cortex. This would allow a person to drag even more of their visual and verbal imagery with them through time. The PFC is like a switchboard with contacts in all kinds of other brain areas. The harder the PFC works, the longer various representations and subroutines can be maintained. In some senses, this WOULD allow someone to use more of their brain. Using time and mental resources to recall deactivated memories is like trying to use your fingernails to pry out the implements of a Swiss Army knife. Increasing PFC activity, though, would be akin to having the most recently used tools in the knife unsheathed and ready for implementation.

Monday, June 20, 2011

Stress Primes You for Negative Thinking


I watched a great animated movie last night, "All-Star Superman." In it, Lois Lane visits Superman at his Fortress of Solitude for the first time but, disappointingly, she starts acting bizarrely. She becomes highly nervous and begins to assume that Superman brought her there to experiment on her. After planning frantically to defend herself, she grabs a kryptonite laser from his arsenal and blasts him with it.
It turns out that she was exposed to some chemicals that increased activity in her amygdala. At that point, it all made sense to me. Even though Lois Lane is kind and thoughtful, she has the potential to become paranoid when severely stressed. She didn't have a good reason to suspect Superman of foul play, but when our amygdala is activated, we often trust it unquestioningly. We do this because it is so often right. The amygdala has a mind of its own. It unconsciously listens to many other brain areas and takes cues from the environment about when to be scared. We accept its messages as a type of foreboding intuition.
When activity in the amygdala increases, and the adrenal glands begin to release adrenaline and cortisol, the brain becomes primed for negative thinking. It is as if your brain is retuned to perceive things as troublesome or upsetting. This is the opposite of a manic episode, in which a person might perceive everything as a happy, lucky coincidence. A friend of mine who has experienced mania told me that for two days it felt like all of the cars on the freeway moved to let him through, like everyone was agreeable and like everything was going his way. When I start to feel that everything is going poorly, I try to remember this - that neurochemicals can paint over reality.
I have noticed recently that if one thing stresses me out, I am much more likely to get stressed out about other, completely unrelated things. I might get upset about an unfortunate circumstance and then wear cynical glasses for a full hour afterwards. One could say that this "displaced" negative thinking is not logical. It may be evolutionarily logical to be prepared for the worst during bad times, but from a modern, practical perspective it is illogical to generalize anxiety to whatever your mind turns to.
Remember, cortisol is high in the morning, so don't give morning stressors the attention they seem to demand. Also, remember Lois Lane, and make sure that one unfortunate circumstance doesn't lead to a domino effect of paranoia. Nowadays, I try to notice when I carry negativity over from one thought to another. When I can notice it, I try to tell myself that the negativity may feel valid and intense, but it is probably just residual and misattributed emotion.

Wednesday, June 15, 2011

A Reliable and Replicable Hallucination


Over the last six months I have come to notice that I experience a specific hallucination more frequently if I am stressed.

The sink in my bathroom makes a high-pitched humming sound. I think it actually vibrates at more than one frequency, creating a phenomenon known in acoustics as "beats." Anytime I turn it on, the sound seems to waver back and forth between two high notes, like the siren of a cop car.
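(A quick aside on the acoustics, for anyone curious: when two tones of nearly equal frequency sound together, their combined loudness pulses at the difference between the two frequencies. The numbers below are purely illustrative - I never actually measured the sink.)

beat frequency = |f1 - f2|, so tones at 1,000 Hz and 1,002 Hz would waver at 2 Hz, about two pulses per second.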

Once, after a few days of intense stress, I was absolutely sure that the sound of the sink was actually the sound of my home alarm, and I ran from the bathroom to the alarm console to turn it off. Of course, the console display showed me that there was no activity, and I was alarmed to realize that I had experienced a very convincing hallucination. Over the following days, while washing my hands, I marveled at how I could actually mistake the sound of the faucet for an alarm. They did sound similar, but when I was not stressed, the two felt distinctly different. I wondered what neurological state stress must have driven me to for me to miss the clear difference in their acoustic properties.

As you might have guessed, this misperception has occurred over and over during the last few months. Anytime I am stressed, I begin to make this false interpretation and feel startled and momentarily fearful. The worse the stress, the more convinced I am that I am actually hearing the alarm. This phenomenon has, for me, underscored the reliable clinical relationship between stress, dopamine dysregulation, and hallucinatory and delusional experiences.

Convincing Actors Trick Lower Brain Systems into Embellishing Their Performance

I think that the best actors are able to trick themselves into becoming emotional when they say their lines. Bad actors aren't convincing because they do not have the emotional thrust to come across as genuine. When we speak freely, the words that we string together into sentences have a neurological pressure behind them. Anytime we are motivated to do something, dopamine surging from lower brain centers helps to guide our diction and compose our nonverbals. On a neuroscientific level, heartfelt speech has momentous subliminal force behind it. If you are only using your frontal cortex to guide speech, it will not have this quality and will come across as counterfeit to anyone with a trained ear.
Sometimes I will say a line that I like, to little effect, and wonder why it fell flat. I think that to recreate great lines effectively, you can't just try to say the line the way you have heard a good actor say it. You actually have to be emotionally impelled by the words; you have to feel motivationally compelled to say them, as if, at that moment, there is nothing else you would rather say. Reward systems such as the mesolimbic and mesocortical dopamine pathways have to "think" that they are accomplishing something immediately useful in order to create the panoply of unconscious accentuations. When you act genuinely in this way, aspects of the delivery (such as the intonation, inflection and affect) emerge reflexively and unconsciously. I think that this is why it is hard to successfully and persuasively recreate another person's mannerisms unless you are able to recreate their motives.
Acting involves coercing emotional centers of the brain to cooperate. These limbic regions were built to be turned on and off by "hot" emotional cues, not by our "cool," conscious mind. They were never evolutionarily programmed to help humans manufacture premeditated action in drama or film. We have to recruit these regions deceptively - they have to believe the lines, even if we don't.

Impulsive or opinionated people say exactly what they are thinking in front of others. Inhibited people, on the other hand, can come across as inauthentic because they often may not really mean what they say. They know what they are supposed to say, and they say it, but to most people it will sound like what it is - emotionally contrived and devoid of the spontaneity and sentiment that should accompany it. All my life, I have struggled to do a convincing job of being myself, and I have found that the only way to do this is to premeditate less, extemporize more, and, quite simply, do what I want more often. This may be a dangerous mode of operation for a young child, because a child's first reactions may be morally inexperienced or lacking in social graces. After a full childhood of experience with how to act, it seems that my disinhibited inclinations are finally socially permissible, so perhaps I should give myself permission to operate off-the-cuff, with less deliberation and more theatrics.

Thursday, June 2, 2011

Radio Interview & Press Release on Autism






- Press play above to hear an interview that I did with KFWB news radio about autism.

- This is the USC-issued press release for the article:


Autism May Have Had Advantages in Humans' Hunter-Gatherer Past, Paper in Evolutionary Psychology Finds

Los Angeles, CA. A brain science researcher at the University of Southern California has concluded that the autism spectrum may represent, not disease, but an ancient way of life for a minority of ancestral humans. The article, published in the journal Evolutionary Psychology, proposes that some of the genes contributing to autism were selected and maintained because they facilitated solitary subsistence during prehistoric times. Individuals on the autism spectrum are described as having had the potential to be self-sufficient and capable foragers in environments marked by diminished social contact. In other words, these individuals, unlike neurotypical humans, would not have been obligately social and would have been predisposed toward taking up a relatively solitary lifestyle. In fact, certain psychological characteristics of autism are taken as a suite of cognitive adaptations that would have facilitated lone foraging.

People with autism have difficulty interacting socially, preferring to focus on narrow fields of interest. Reser speculates that, in the ancestral past, the penchant for obsessive, repetitive activities would have been focused by hunger and thirst towards the learning and refinement of hunting and gathering skills. Today, autistic children are fed by their parents and so hunger does not actuate or guide their interests and activities. Because modern children with autism are not able to forage or to watch their parents forage, and because they can obtain food free of effort, their interests are redirected toward salient, nonsocial activities; some classic examples being: stacking blocks, flipping light switches, lining toys up in rows, playing with running water, chasing vacuum cleaners, and collecting bottle tops.

A popular new perspective on autism, frequently referred to as the "autism advantage," purports that unlike mental retardation, autism has compensating benefits, including increased abilities for spatial intelligence, concentration and certain forms of attention and memory. A great deal of new research supports this line of reasoning by illustrating that although individuals with autism have trouble with social cognition, their other cognitive abilities are largely intact. Until now, though, very little attention has been given to how autism's advantages may have played a role during human prehistory.

Reser compares the behavior of individuals on the autism spectrum with the behavior of other solitary foraging animals including orangutans and montane voles. Reser points out that like dogs, some animals are obligately social whereas others, like cats, can transition between social and solitary lifestyles. He emphasizes that individuals on the autism spectrum share a variety of behavioral traits with solitary species.  Both are low on measures of gregariousness, socialization, direct gazing, eye contact, facial expression, facial recognition, emotional engagement, affiliative need and other social behaviors. Despite the fact that solitary animals are low on these measures, they do not have difficulty learning the behaviors they need to survive. It may have been the same for individuals with autism.

In prehistoric times, group size may have fluctuated greatly, and inconsistencies in the ways that natural selection influenced human social abilities may well be responsible for the large variation in social ability seen in human populations.

“Conceptualizing the autism spectrum in terms of natural selection & behavioral ecology: The solitary forager hypothesis,” is published in Evolutionary Psychology and is available at:

http://www.epjournal.net/filestore/EP09207238.pdf

Eddie North-Hager / Associate Director of Media Relations /  University of Southern California

•••


- Below are some recent posts about the article, including one by Science Daily. The one at care2.com is very interesting to me. It is written by a doctor who is the mother of a child with autism and it makes several great points.