BY MANIE BOSMAN
“The gene was the central issue in biology in the 20th century. The mind is the essential issue for biology in the 21st century.” – Eric Kandel
At the risk of coming across as arrogant for saying this about an esteemed Nobel Prize winner, I suspect that Eric Kandel completely underestimated the impact that brain science would have in the 21st century. While it may very well be the major focus in biology, neuroscience is also transforming many of the social sciences and other areas of life as we know it.
The development of new technology and brain research methodology, especially functional brain imaging techniques such as functional Magnetic Resonance Imaging (fMRI), Positron Emission Tomography (PET), Electroencephalography (EEG) and Near-Infrared Spectroscopy (NIRS), is providing us with a constant flow of exhilarating insight into the workings of the human brain. This gives social sciences such as psychology, education, cultural studies, linguistics, criminology, communication studies, economics, law, sociology and management a scientific perspective they never had before. As these new insights start to trickle through into practice, some are prompting radical change and complete re-interpretation.
The Brain’s Day in Court
Take the law, for instance. A burning issue that needs to be addressed sooner rather than later is what role neuroscience should play in the courtroom. Studies have shown that in juveniles the brain, and the prefrontal cortex in particular, is still developing. This is the part of the brain responsible for our ‘working memory’ – handling conscious thought, which includes decision making, communication, remembering, discerning between right and wrong and inhibiting impulses. Knowing this, should courts be more lenient toward juvenile offenders?
Other studies have shown that abnormalities in the brain’s amygdala (part of the limbic system that plays a role in emotional processing) and low levels of an enzyme called monoamine oxidase A (MAO-A) can lead to increased aggression. Should this be taken into consideration when dealing with violent offenders?
But there’s more. In 2007 a team of German neuroscientists performed an experiment in which they used fMRI scans to monitor brain activity in real time while participants chose to press a button with either their left or right hand. They found that while the conscious choice to push the button was made only about one second before the button was pushed, participants’ brain activity patterns seemed to predict their decisions by as many as seven seconds. In other words, seven seconds before the participants became consciously aware of making the decision, their brains seemed to have already decided for them. These and similar studies have sparked an intense debate on whether ‘free will’ even exists.
The Limits of Self Control
Neuroscientist Matt Lieberman published a study in 2009 showing that one region of the brain – the right ventrolateral prefrontal cortex – is responsible for several different forms of self-control. This particular part of the brain, which he calls the brain’s ‘braking system’, operates when we inhibit a physical impulse such as the urge to stare at a beautiful woman (motor self-control), suppress a bad thought (cognitive self-control), manage our emotions (emotional self-control), refrain from buying a bargain just because it’s on sale (financial self-control), or set aside our own viewpoint to consider the feelings and views of others (perspective-taking self-control).
It seems then that while, technically speaking, we might not have complete ‘free will’ in the traditional sense, we do have free ‘un-will’ – the ability to control the impulses or choices which our brains may make as the result of some subconscious biochemical process. However, the jury is still out on whether we’re always responsible for our choices and actions – our brain’s ability to exert self-control can be severely weakened by several factors, including neurochemical imbalances, neurological disorders, malnutrition, exhaustion, severe anxiety and even chronic stress.
So is the criminal offender with a neurochemical imbalance, or one suffering from a disorder such as psychopathy, accountable or not? Some believe it is too early to decide while neuroscience is still in its formative years, but the question won’t stay out of court and mainstream law for much longer. This was confirmed by a recommendation in a 2011 Royal Society report titled “Neuroscience and the Law”, which stated that “University law degrees should incorporate an introduction to the basic principles of how science is conducted and to key areas of science such as neuroscience and behavioural genetics, to strengthen lawyers’ capacity to assess the quality of new evidence”.
Selling with the Brain in Mind
Findings from neuroscience are also posing some interesting questions in economics and marketing. Emerging fields such as ‘neuroeconomics’ and ‘neuromarketing’ tap into neuroscience to understand what drives people to buy or sell, and how this understanding can be utilized.
From neuroscientific research on decision making we know that our decisions are much less rational than we tend to think – in fact, most of the decisions we make every day are strongly driven by the emotional centers of our brains. To better understand and ultimately influence consumer behaviour, research in neuromarketing aims to determine the underlying (often emotional) processes behind economic decision making and brand preferences.
Some of the world’s foremost companies have already latched onto this and are investing millions in neuroscience research. As you read this some of these research findings are being used to construct targeted advertising campaigns, design new consumer products and create shopping environments intended to subliminally encourage buying. However, while some view neuromarketing as the start of a brand new era (pun intended) for the marketing industry, it is not without its critics.
A number of neuroscientists and sceptics have publicly rejected it as “pseudo-science”, while others are concerned for ethical reasons. One such critic, neuroaesthetics expert Anne Belden, published a paper questioning the ethical implications of neuromarketing, asking at what point this emerging field is still an acceptable application of scientific results and when it becomes the unethical manipulation of people’s minds for profit. Only time will tell whether neuromarketing will be “the next big thing” in driving consumer behaviour, but we can be sure that as long as there’s money to be made, it will be around for some time to come.
The Brain Goes to War
If there is one area in which the distinction between reality and science fiction becomes completely blurred, it is the application of neuroscience for war and military purposes. Headlines in Wired and other publications reporting on the topic read like something out of a cheap 1970s spy-fi novel: “Air Force Wants Neuroweapons to Overwhelm Enemy Minds”; “Top Pentagon Scientists Fear Brain-Modified Foes”; “Neuroscience Could Mean Soldiers Control Weapons With Their Minds”; “Darpa Wants Remote Controls to Master Troop Minds”; “Nonlethal Weapons Could Target Brain, Mimic Schizophrenia”; and “Future Wars May be Fought by Synapses”.
The fact that one or two of these reports may be dodgy on the detail doesn’t for one moment mean that they’re not true, or not at least based on some version of the truth. Just how seriously the US government, among others, considers the potential military application of neuroscience is illustrated by the fact that the Defense Advanced Research Projects Agency, the Pentagon’s science agency commonly known as DARPA, was granted around US$240 million to fund neuroscientific research in 2011. In the same year the US Army spent US$55 million, the Navy US$34 million, and the Air Force US$24 million on research in this field.
According to a 2012 Royal Society report, “military interest in neuroscience has two main goals: performance enhancement, i.e. improving the efficiency of one’s own forces, and performance degradation, i.e. diminishing the performance of one’s enemy”. Nothing new there – strengthen your own and weaken the enemy, but what’s new is how it’s done.
Possibilities based on current and emerging technology include using neuroimaging and brain stimulation to screen and select soldiers for specific abilities; drugs to enhance endurance and keep combatants focused and alert; chemicals that shut down enemy brains within minutes; drugs to make captives spill the beans during interrogation; drugs to prevent post-traumatic stress disorder; combining human brains with computer programs through brain-machine interfaces; and intelligent brain-controlled drones. With the rate at which new information is churned up and new technology developed, the only limits on where ‘neurowarfare’ can go in future are probably those imposed by ethical conventions and agreements. Again, only time will tell if that will be enough.
While most of the world’s current and future conflicts result from competition for the same limited resources, many clashes are still caused by conscious or subconscious allegiance to different groups (e.g. racial, religious, tribal, ethnic, ideological or social). Neuroscience is providing valuable insight into inter-personal and inter-group interaction that might help create greater harmony amid diversity. We are beginning to understand that what often manifests as racism or other forms of discrimination is largely our brains’ automated response to anything and anyone perceived as a possible threat or enemy (see my earlier post Your Racist Brain: The Neuroscience of Conditioned Racism).
Our brain’s primary function is to keep us alive, and on a neurological level it simply does not respond to people it perceives as ‘enemies’ in the same way it responds to ‘friends’. Perceived enemies (even harmless strangers) can trigger an automated fear-induced ‘fight or flight’ response for no other reason than being perceived as different from those we regard as our ‘ingroup’. While our brains seem eager to assign negative ‘labels’ based on appearance or background, several studies have shown that this can in fact be modified – probably most effectively through individual exposure. Our conditioned fear response can be countered and even reversed through close, positive contact with those our brains have labelled as ‘enemies’. This not only provides new opportunities to address prevailing issues of racism and inter-group conflict, but also offers new prospects for cross-cultural interaction, diversity management and even team building.
No Dark Sarcasm in the Classroom
Hated maths in school? A recent study showed that anxiety about maths, like social pain (e.g. humiliation), activates the same areas of the brain as physical pain. On the other hand, musical training during childhood actually shapes and changes the brain in positive ways that continue to benefit a person into adulthood. One of the most promising new fields to emerge over the last couple of years is ‘neuroeducation’ – a convergence of neuroscience, cognitive psychology and education theory. As we gain more understanding of the neurobiology of memory and learning, we are able to adapt and design teaching methods that yield maximum results.
Several studies using neuroimaging and measurement of neurotransmitters have shown that learners’ comfort level can influence the uptake and storage of information in the brain, and that both children and adults learn better when they are happy. In stressful situations, information is prevented from entering the parts of the brain where learning and memorizing take place. As educational neurologist Judy Willis describes it, “when stress activates the brain’s affective filters, information flow to the higher cognitive networks is limited and the learning process grinds to a halt.”
Considering these and other findings about teaching methods and learning conditions, Willis and others actively campaign for greater incorporation of neuroscience in schools. She makes the case that “neuroimaging and neurochemical research support an education model in which stress and anxiety are not pervasive. This research suggests that superior learning takes place when classroom experiences are enjoyable and relevant to students’ lives, interests, and experiences”. Some schools in the USA and Australia have already started to adapt their teaching methods accordingly, but for now most are yet to see the light.
Leading to Engage Their Minds
For me personally, perhaps the most exciting contribution from neuroscience is the continuous stream of insight into the essentially social nature of the human brain. We now know that the same automated neural responses that drive us towards food or away from danger are triggered when we interact socially. In other words, whether we’re having a discussion with a colleague at work, attending a concert, or relaxing with friends, our behaviour is constantly triggering either ‘threat’ or ‘reward’ responses in each other’s brains.
For leaders and managers, the implications of this discovery are far-reaching and have led to the emerging field of neuroleadership. Our brains are basically social organs which react to social situations (and social pain in particular) in exactly the same way they react to the physical environment (and physical pain). This also means that the brain views the workplace primarily as a social environment, where it constantly assesses social interactions as either threats or rewards (see my earlier posts Neuroleadership: How Your Brain Fights for Social Survival in the Workplace, SCARF: Lead in a Way That Will Engage People’s Minds and Proof From Neuroscience That Trusting People Makes Them More Trustworthy).
Our Brains on Social Media
This social nature of the human brain also helps explain the internet and social media frenzy of the last decade. Only China and India have larger populations than Facebook, which is now heading towards one billion members. It turns out that our addiction to social media is largely about the neurochemical rewards we derive from it.
A recent study by neuroscientists from Harvard University revealed that talking about ourselves activates parts of the mesolimbic dopamine system, a region of the brain also associated with the sense of reward and pleasure we derive from food, money or sex. While in everyday life we talk about ourselves around 40% of the time, the average Facebook user devotes no less than 80% of their efforts to sharing their own thoughts and feelings, so that means double the rewards! As David Rock, author and neuroleadership consultant explains, “a sense of increasing status is one of the biggest drivers of reward in the brain… On Twitter, every time another person signs up to ‘follow’ you, you feel a little burst of reward that makes you want to post more.”
But neuroscience has also turned on the warning lights with regard to our ‘always switched on’ technology. Researchers at Stanford University discovered that overuse of technology and social media can actually reduce your intelligence! In a study involving student volunteers, they found that media multi-taskers pay a ‘big mental price’. “They’re suckers for irrelevancy,” warned Clifford Nass, one of the researchers. “Everything distracts them.”
And There’s More…
There are many other areas in which neuroscience is prompting change and new possibilities, including healthcare, sport, medicine, treatment of mental illness and injuries, cognitive enhancement (improving intellectual performance), treatment of addictions, ethics, philosophy, change management, governance and policy making, communication, entertainment, art, architecture, coaching, relationships, stress management, politics and even agriculture. Obviously it would be impractical to discuss all of these in a blog (in fact, if I wanted this article to reach a larger audience I should have stuck to about 360 words, considering indications that the internet might be rewiring our brains and reducing our attention span).
In 2001 Ray Kurzweil, renowned American futurist and inventor, wrote an essay in which he stated that “an analysis of the history of technology shows that technological change is exponential, contrary to the common-sense ‘intuitive linear’ view. So we won’t experience 100 years of progress in the 21st century – it will be more like 20,000 years of progress (at today’s rate).”
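Kurzweil’s arithmetic can be sketched with a toy model. As an illustration only – the doubling period here is my own assumption, not a parameter Kurzweil specifies in the essay – suppose the rate of progress doubles every decade, starting from one “year of progress” per calendar year:

```python
def effective_progress(years=100, doubling_period=10):
    """Total 'years of progress' accumulated over a span of calendar
    years, when the annual rate of progress doubles every
    `doubling_period` years (illustrative assumption)."""
    return sum(2 ** (t // doubling_period) for t in range(years))

# Over 100 calendar years this toy model accumulates
# 10 * (1 + 2 + 4 + ... + 512) = 10,230 years of progress.
print(effective_progress())  # 10230
```

Even this crude version lands in the tens of thousands of “years” rather than a hundred, which is the whole point of the exponential-versus-linear intuition Kurzweil describes.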
While there’s no doubt that communication technology will continue to be the main catalyst and medium for this change, neuroscience will also stake its claim. As far as science goes, it’s still early days for neuroscience, yet it is progressively moving us into a ‘brain new world’.