Executive Poverty
by David Alan Goodman, Ph.D.




Between 1964 and 1968 multiple long-term trends were reversed. The human prefrontal lobes, once called "the organ of civilization," were subjected to a quarter-turn that recast the cerebral cortex as paired hemispheres described as complementary. Almost simultaneously, the now-absent prefrontal dominance was replaced by a version of higher nervous system activity named "executive function." Following widespread implementation, executive function became the benchmark for science researchers and managers sharing policies and plans while implementing projects and programs at their facilities. During four decades of acceptance, executives were able to dramatically reverse the trends against permitting juveniles and adolescents access to mind-altering drugs. Executive function and drug promotion became hallmarks of the prevailing paradigm. Now, after forty years, more and more creative scientists worldwide propose a second brain revolution, this one restoring the human prefrontal lobes to their former prominence. By the mere process of reversing the previous quarter-turn, a new paradigm can emerge early next decade with the potential to restore greater sanity to civilization, better defend children from potentially dangerous drugs, and supplement executive hegemony by having mature parents and adolescents provide their visions for the future.

The human prefrontal lobes were recognized for more than sixty years as the region of the cerebral cortex responsible for higher nervous system activity. North American and Western European scholars concurred with their Eastern European colleagues that the progressive maturation of the prefrontal lobes was responsible for the "high" powers capable of elevating the cumulative wisdom of the human species. Their shared neurological interpretation represented uncommon agreement during times of political turmoil. This was also the era in human history when international scientists agreed that higher nervous system activity was especially vulnerable to narcotics during the juvenile and adolescent years. Their agreement meant that as recently as 1960, in one major American city, only one of 30,000 students in junior and senior high school had used a narcotic.

The teaching that the higher powers of the brain were inversely related to narcotics use persisted as the cultural norm until the later 1960s, when popular teachings about the left and right sides of the brain being separate and equal began the nullification of previous neurological teachings about hierarchical brain function. The aim of R. W. Sperry, originator of the "split brain" reported in 1964, had been to educate scientists about the importance of the nondominant cerebral hemisphere. Nonetheless, within four years a coordinated effort was sufficient to educate millions of juveniles and adolescents to the new brain model, now split conveniently into a left side leading the life of social conformity while the right side enjoyed an innovative lifestyle dedicated to sampling varieties of psychoactive drugs.

The familiar right brain-left brain model that replaced the previous teaching of the human cerebral cortex as the site for higher nervous system activity became known as a brain revolution: a one-quarter turn of the cerebral axis, shifting the definition of the split brain from anterior-posterior to left-right. The practical result was a decline in scientific publications about how prefrontal lobe powers protect against drug use. The more the beliefs in higher powers dwindled, the more the new generation of juveniles and adolescents adopted the belief that "high" was synonymous with intoxication. While the youth extolled chemically induced highs, there grew almost simultaneously through the later 1970s a scientific interest in the extreme anterior end of the central nervous system as the site for what the scientists called "executive function."

In this article the author reviews how the brain revolution establishing the dominance of executive function changed world society in two essential ways. First, it introduced the idea that getting high would eventually become the birthright of juveniles and adolescents. Second, the rapid escalation in the number of executives, the science managers with interchangeable roles, led to science policies and plans worldwide that permitted them to declare an epidemic of mental illness, followed by the imperative to administer prescription drugs to children and their parents.

The predictable result since the late 1970s has been a plethora of college graduates who, perceiving disease and drugs to be the future of science, have enthusiastically embraced executive careers.

Nonetheless, a growing number of thoughtful scientists are now beginning to ask why the strange phrase "executive function" has been used to describe the highest abilities of the human central nervous system. In the popular mind the word "executive" refers to the business world and its imperative that leadership establish policies and plans and hire the managers who can best strategize moving commodities to new markets, with the purpose of increasing corporate profits. Since profits from drug use can be a major engine energizing the worlds of economics and politics, some independent scholars and scientists leaving the mainstream have pointed out that, forty years after the birth of the obsession with drugging children, civilization might best survive by devising more people-friendly new paradigms.

According to Thomas Kuhn, the eminent philosopher of science, a prevailing paradigm depends on like-minded persons in power who expand their reach by defining their scientific paradigm as "normal," thereby rendering themselves more impervious to assaults by adversaries. They can defend their common interests effectively until, as Kuhn pointed out, they are confronted by an increasing number of "anomalies": difficulties in applying their standard model to explain apparent contradictions as they arise. Kuhn provides multiple examples of normal paradigms overthrown and replaced by newer, revolutionary paradigms, among them the Copernican Revolution and the Darwinian Revolution, and he speculates that in the future there can be many more.

Now, thirty years since the Brain Revolution of the 1970s established the popular prevailing paradigm based on executive function, complementary hemispheres, and the championing of drug use, philosophers of science are asking whether the current executive-based paradigm will inevitably survive into the far future. The answer to this question must be: "No." Like the purloined letter of Poe, the anomalies have been hiding in plain sight. Three anomalies will be identified briefly, with the fourth discussed in greater detail. Intelligent discourse requires that we detect these anomalies so that they can be seen clearly, like the purloined letter hidden in the letter box.

Anomaly One: Drugging Evolution

Since Darwin, intelligent scientists have taught how variation, selection and then isolation are critical for evolution. During hundreds of millions of years, variation in the vertebrate genome has played a major role. These genetic alterations, displayed as differential phenotypes, provide the raw material for selection. Natural selection assisted by isolation produces a population better able to survive in diverse environments. These evolutionary teachings are taught by biologists who are also the executives managing science today. Accepting variation and selection as paramount, and the concept of biological fitness as imperative, how can physicians trained as biologists stand in classrooms and tell students and teachers, the survivors of a culling process lasting hundreds of millions of years, that twenty percent of them are mentally ill?

Many critics of drugs in the classroom and of medical politics doubt that twenty percent figure, the proportion of students and teachers potentially classified as mentally unfit and therefore losers in a culling process that has lasted hundreds of millions of years. The claim by physicians hired by the schools that students who differ from executive standards are ill and require medication could hardly be more removed from the teachings of vertebrate evolution. For this reason, the first anomaly working against longevity for the prevailing paradigm is that widespread mental illness in the classroom relates less to vertebrate evolution, which depends on normal variation, and more to high-level decision-makers drugging children and teachers who depart from executive norms.

Anomaly Two: Drugging Children

Distraught parents can easily be persuaded that the wild, unfocused energy of their defiant child qualifies him to be called mentally ill. Parents, being vulnerable to figures representing authority, can be tempted to take their child to a certified physician capable of bringing the child back under control, albeit with a restraint that is chemical (pharmaceutical). That the child must be declared mentally ill to receive the medication makes perfectly good sense to parents who are on the verge of tears.

What makes little sense to most teachers of evolutionary biology is how physicians evaluating children as young as age six, or even younger, can diagnose the littlest citizens with mental illness because they lack proper executive function. Can physicians really fail to understand that the prefrontal lobes of six-year-olds, while growing steadily since about age two, can scarcely inhibit "excess" physical activity before about age eight or nine? Going further, it is not just immature little boys who are called mentally ill. Physician diagnostic manuals can target many students considered symptomatic because they are defiant, refuse to play with other students, blurt out insulting classroom comments and then respond inappropriately to impending threats of punishment by the teacher. Bipolar disorder in boys and girls, as well as hyperactivity and wandering attention, have been diagnosed in classrooms where prefrontal lobes equipped to properly inhibit moods and activity deemed inappropriate can be years away.

Recently, little girls have joined the little boys diagnosed with hyperactivity and wandering attention, being diagnosed themselves with bipolar disorder even though the prefrontal lobe tissue capable of inhibiting "inappropriate" moods may not mature until about age eight or nine. One report released in 2007, surveying the war against mental illness in the schools, found that the diagnosis of bipolar disorder in little boys and girls increased 40-fold during the previous ten years. This unprecedented explosion of a diagnosis linking childlike behaviors to manic-depressive psychosis must be considered the second anomaly counting against the prevailing paradigm.

Anomaly Three: Drugging Parents

Parents confronted by authority figures drugging their child naturally respond by showing grief. To them, having a child called deficient in executive function can cause sadness and depression. At this time of personal need, the executive-trained physician can step forward offering prescription medication to the grieving parent. Most often the physician hands the parent a box filled with encapsulated medication from the family of drugs called SSRIs. Since 1988, when the first of them, Prozac, arrived, these selective serotonin reuptake inhibitors have been the chief chemical weapons against grieving, sadness and depression. This class of drugs is promoted as clearing the mind, focusing attention, blocking out negativity, and promoting feelings of happiness.

What the physician fails to mention is a major side effect of the SSRIs. This is not just the weight gain, numbed emotions, sexual side effects or suicidal thoughts. Rather, the major side effect of the typical SSRI is depletion over time of the critical brain transmitter dopamine, responsible for human executive function. In previous studies, chronic dosing of adults with an SSRI routinely reduced the release of dopamine by 30 percent and often substantially more. Although the conclusions are not always clear cut, serotonin, which suppresses negative emotions, and dopamine, responsible for executive function, tend to be opponents. Elevating the one tends to diminish the other.

In this way, grief-stricken parents imagining the loss of the plans and goals set for their child's independence, and having negative thoughts, can be placed on drugs. The end result is often drugged parents meeting in support groups to talk about their drugged children, their unique problem-solving abilities held in check by mental health professionals. Compliance with the belief that the parents can no longer be the executives organizing their child's future represents the third anomaly accruing to burden the future of the prevailing executive paradigm.

Summarizing anomalies three, two and one, the obvious conclusion is intelligent design in the classroom: how many children lack executive function, and how willing parents are to forgo their own executive function, defines what mental health is. Children acquire core executive function only during the prefrontal lobe growth spurt between the ages of eight and fourteen, but apparently this means only that at age six they can be diagnosed as overactive, wandering in attention and sick with bipolar disorder. Parents who would weep for the loss of their child may then be administered drugs to block their emotions. This is the world of today, when physicians awaken mornings, travel to schools, and don their monastic white smocks believing that diagnosing and dosing children and parents represents their only possible future.


Despite the endemic confidence of physicians in paradigmatic perpetuity, a specter looms: a coming revolution permitting children and parents to throw off the molecular chains imposed by drugs. This impending fourth anomaly is likely to become vividly clear within three or four years. This challenger to medical hegemony is named Chronomics. It is the science resulting from the decoding of the human genome in 1998, painting a picture of the prefrontal lobes remarkably different from that of a receptor site for the drugs treating mental illness.

Anomaly Four: Mental Chronomics Arrives

The decade since 1998 has been the era of the discovery of the human genome and the identification of the "clock genes" residing at the core of the new science of Chronomics. Many physicians comfortable with professional status do not know what to think about "clock genes." They know a bit about circadian rhythms, sleep disorders, jet lag and seasonal affective disorder; they medicate these as diseases. They may have read the 2003 article "Time for Chronomics?" but likely this means less to them than improving their golf swing.

But they are wrong. Chronomics inevitably opens Psychiatry to a new world of the human mind so unexpected that it is scarcely distinguishable from magic. Briefly stated, Chronomics emerges from hundreds of millions of years of vertebrate evolution under the periodic shifts in energies linked to the periodic movement of the sun and moon across the heavens. Energies originally linked to periodic motion became embedded in the DNA of cells, including neurons in the human brain. Clock genes named "timeless" and "period" synthesize chemical messengers capable of biasing nerve cell membranes in extensive regions of the brain. This enhances cyclicity in humans, triggering the sleep-wake rhythm and, in women during their reproductive years, the monthly emotional rhythm.

What Mental Chronomics teaches is the action of clock genes on four primary neurotransmitters of the brain stem: serotonin, dopamine, acetylcholine and norepinephrine. The result, according to contemporary teachings, is a mental cycle thought to last about a month in adult women and men, eliciting a predictable series of moods and dreams based on the dominance, in a predictable sequence, of these modulators of the emotions. Women acknowledge that from about the time their menses cease, they experience four emotional phases: feeling calm and contented during the first phase, congenial and executive during the second, fatigued and depressed during the third, then irritable and tense during the fourth. These cycling phases are important because proponents of Mental Chronomics claim that the emotional cycle persists in women beyond their reproductive years and is present in healthy adult men.

Periodic activation of the primary neuromodulators in a dynamic sequence during the biological month means that men and women routinely experience two weeks of positive moods linked to serotonin and dopamine, and two weeks of negative moods linked to acetylcholine and norepinephrine. Managers of research science have labored for more than half a century to create a static system in which patients feel calm, contented and happy during every day of the biological month. While appealing to perceived need, the philosophy of perpetually feeling positive flies in the face of thirty years of research identifying, in most and perhaps all men and women, two weeks of dreams with calm and happy emotions followed by two weeks of dreams accompanied by sad and tense emotions.

Why this has not been discovered sooner in men can be linked to recent discoveries, among them: the biological month in some men may be little more than a day, or it can be as long as about nine calendar months. The biological cycle may not be strictly sinusoidal as believed, but may be chaotic and unstable, shifting in frequency and amplitude. The most surprising discovery may be that the cycle is both complex and unique in every individual. These conclusions emerge from 20 years of nightly tracking of 22,000 annotated and timed dreams, from analysis of the dream series in Hobson's "Engine Man," and from numerous other dream series in published books and Internet downloads. In every case analyzed thus far, allowing for individual differences, the dreams have contained the periodic tetrad in the same sequence.


Thomas Kuhn's followers in science, especially neuroscience, can probably detect in the four anomalies a new paradigm capable of challenging the brain model accepted as canon by the chief executives. The fourth anomaly in particular raises strong questions about the prevalence of mood disorders, especially among children whose brains are just beginning to acquire the adult rhythms. Why does the static brain model prevail when modern cognitive science teaches the prevalence of dynamic distributed networks linked to shifting chemical transmitters and interpreted in the prefrontal lobes? These questions, and those related to unsophisticated diagnostics and to executives serving the needs of medicine and manufacturing, demand answers.

The simplest and most credible solution to these problems can be as simple as the quarter-turn of the human brain along its major axis, returning the prefrontal lobes to the position of dominance they enjoyed before the drug revolution broke free during the mid-1960s. To a growing chorus of scientists, the time has come for a new post-executive brain model capable of raising anew questions about the ubiquity of diagnosis and dreams. Credibility for the next brain revolution beyond executive function is supported as well by the abundance of discoveries in prefrontal brain function during the past ten years, as seen in Table One.


A. The Heavens Descend
As should be clear by now, the author proposed in a 2004 journal article that clock genes, reflecting vertebrate evolution beneath the heavens during hundreds of millions of years, are responsible for the periodic activation of nerve cell clusters in the primitive brain stem secreting serotonin, dopamine, acetylcholine and norepinephrine. The results, being in part hypothetical, can be valuable in bringing down to earth the basic idea that human beings best foretell the future by previewing during the month the best and worst things that can happen to them, then, through activation of tissue at the extreme anterior end of the prefrontal lobes, choosing from among alternative mental states the life they prefer to lead.

B. Evolved Women
Men who are science managers have promoted an executive function mirroring how they think. How can they still justify that decision to women, whose prefrontal gray matter is proportionately greater, whose neurons are more densely packed, and whose white matter more readily connects the sides of the brain? When scientists research "higher" powers, they will be wise to study women, who are more moody, tearful and indecisive. After all, men and women share the cyclicity; tears of anguish can heal the body; and what men call indecision may be tempered restraint based on foresight, enabling women to avoid rushing off to declare endless wars. Women's brains are also more verbally fluent and environmentally sensitive than men's.

C. Right Prefrontal Lobe
The idea that the right side of the brain evolved to be a receptor site for drugs, and that it was the depressed side of the brain, now seems frivolous on the one hand and atavistic on the other. The right prefrontal lobe counsels caution against making hasty decisions to rush down the wrong path. This side of the prefrontal lobes can be the site for the emotional and social intelligence that in many executives is secondary to acquiring the strategic skills required for personal and corporate gain. The right prefrontal lobe, compared to the left, can be more gracious, better sense nuance, add intellectual spice to life and, most of all, embrace any new paradigm whose implementation requires happy children and proud parents comfortable with new ideas entering through the right front side of the brain.

D. Lofty Decision Maker
The topmost decision-makers in 1950 believed that the "highest" nervous system activity occupied the anterior pole tissue just behind the forehead. Anterior pole tissue in humans was thought to be larger and better connected, qualifying it as the most likely candidate site for the integration of human abilities. Evidence accumulating over the past decade establishes this region of the human brain as the one best connected to select among primary emotional states and choose the preferred one. It is also the region of the brain most able to suppress emotional memories disruptive to locking on to long-term romantic and career goals to be achieved despite obstacles.

E. Authentic Genius of the Human Brain
How many authentic geniuses of the past might inadvertently have been placed on pharmaceutical chemicals by psychiatrists? Evidence accumulating during the past ten years identifies extraordinarily creative persons as having the uncanny knack to detect and reject the scripts, schemes, plans and habits implanted in them by others. They are unique in being able to operate for long periods appreciably above the executive level. They can persist in their projects despite lack of support. Among geniuses in science, we remember how Newton, Darwin, Mendel and Einstein worked intrepidly toward long-term goals while remaining distant from university, peers, grants and graduate students.

With a revolution of the brain comprising a quarter-turn back to the paradigm prevalent fifty years ago, made more provocative by recent discoveries in cosmic, womanly, right prefrontal, anterior pole and genius function, we reasonably ask how it can be introduced into a society dominated for the past fifty years by executives and drugs. The best answer comes from the writings of H. G. Wells, perhaps the greatest futurist who ever lived. Wells, after a lifetime spent writing more than 100 books and 500 major articles, concluded that professionals meeting in conclaves and congresses rarely can be the best planners for the future. Rather, during the final year of his life, in "Mind at the End of Its Tether," Wells introduced a provocative new idea appropriate for a report on executive poverty. Wells proposed that instead of experts sharing conference tables to plan the future, why not bring together wise parents and their mature adolescents, assisted by the best teachers of ecology and forecasting, to ponder and plan together the best human future?

Get together the wise parents and the mature adolescents with futurists and ecologists, and then ask them to plan the rest of their lives. They will likely reach inside to balance their executive ability with their sense of humanity, along with their hidden genius for resisting outside pressures, and they can have at least a fighting chance to conjure a world of the future dominated by the prefrontal lobes' powers, bypassing the executives running the world now. What can be expected from the best of the youth and their parents is presented as a preview of the future in Table Two.


Suppose independent educators for ecology and foresight converge on a roomful of adolescents set to ponder the greening of the executive future. They acknowledge the reality of academic disciplines like Chronomics, Psychiatry and Gynecology. When asked how to green these professions, one student replies within a few minutes: Chronomics, Psychiatry and Gynecology. "Right on," the forecaster and the ecologist reply.


David Alan Goodman is Founding Director of the Newport Neuroscience Center in Southern California. His personal vision has been to restore the prefrontal lobes to educational preeminence. His major research interests are the biological and biochemical bases of Mental Chronomics. Among his innovations are the recording of physiological function at a distance and the demonstration that cycling emotions, honest tears and indecision rank as higher powers of the brain in mature men. He is also the author of books on how extended drug use subtracts from prefrontal lobe abilities.

Author's Notes:

1. Eastern European research trends in higher nervous system activity in the 1950s centered on the prefrontal lobes as the source of foresight, active imagination, holos (similar to social intelligence) and sophisticated analysis.

Advanced research there and in North America was the subject of a University of Chicago biological psychology course offered by Ward C. Halstead and taken by the author in 1962.

2. The imaginative writings of L. J. West, UCLA psychiatrist, between 1964 and 1968 laid the groundwork for the paradigm of the permissive society, where youth were expected to experiment with recreational drugs considered a civil right, and to receive university training to join science bureaucracies having global reach. It was popularized as The Brain Revolution and as Consciousness Three in popular books by M. Ferguson and C. Reich.

3. The earliest teachings on what he called "Biochemical Individuality" were by R. J. Williams, who vividly described the normal variations in an interbreeding population that create the substrate for human evolution. This emphasis on disparate genomes served to educate readers to how evolution creates differences, arguing against belief in universal norms governing thought and emotion.

4. Drugging children was strongly discouraged culturally prior to 1960. Sellers of tobacco and alcohol were forbidden from selling their products to those younger than age 21, whose brains were still growing. Social values worked so strongly against drugging children that corrupting their minds by distributing habit-forming drugs was limited to agents of international espionage weakening the morale of a civilian population.

5. Front-page stories in major publications on the release of Prozac, first of the SSRIs and a new class of antidepressants, ushered in a multi-billion dollar industry. Provocative news stories and popular books recruited millions into the new world of cosmetic consciousness. The allure of perpetual happiness caused millions to experiment with potent drugs capable of massively reorganizing patterns of neurotransmitter release. This step toward breeding social passivity along with outspoken ego can only be called fascinating to those of us who require a permit and an impact study to plant new trees on our properties because of potential environmental damage.

6. Harvard psychiatrist J. Glenmullen in 2000 gathered the statistics on the opponent relationship in the brain between serotonin and dopamine. His figures on the drop in dopamine after chronic SSRI use run as high as 54 percent. His title "Prozac Backlash," following P. Breggin's "Talking Back to Prozac," identified the massive reorganization of the pattern of neurotransmitter release in brain tissue following chronic SSRI administration.

7. Prozac backlash is often observable after about six months. At this time, in about four of ten patients, thoughts intrude about whether the drug use should be ended, the dose increased, or a second drug taken to potentiate the SSRI.

The attending physician seems unaware of Goodman's Law: all drugs work in the short term, none in the long term.

8. The careful thinker cannot help wondering whether the occasional attacks by the FDA on the safety and efficacy of popular drugs like Prozac, creating negative publicity and diminished use among adults, almost inevitably open physicians' eyes to the schoolchildren grazing in classrooms. These youth seem to provide, at least in theory, the next possible source of replenished income, especially those children whose parents are immigrants from nations where authoritarian controls represent the cultural norm.

9. Mental Chronomics originates in a 1973 article in an Academic Press volume by the author, following his year doing research for a pharmaceutical firm. He believed that before drugging millions with drugs boasting exotic names, the intelligent scientist would be wise to discover what kind of instrument the human brain is. This led to the collection of 30,000 dreams during 31 years of nightly journal keeping. Dream analysis in numerous volunteers, some contributing several hundred dreams, revealed in each subject an individualized four-part cycle called "one's own drummer's drum."

10. The leading writer on the woman's prefrontal lobes and their higher powers is L. Brizendine. D. Goleman occupies the leading edge of writers on emotional and social intelligence, seated largely in the right prefrontal lobes. E. Goldberg has written "The Wisdom Paradox" to follow up on the earlier "The Executive Brain." N. Andreasen, after decades of studying bipolar disorder, now studies "The Creating Brain" and the neuroscience of human genius. Could it be that she realized that different brains do not mean abnormal brains?

11. There is a virtual explosion of interest in the anterior pole cortex, the most sophisticated association cortex, located just behind the forehead. The research of I. Papousek and B. Depue has been establishing that the nerve cells in area A10 at the extreme anterior end are activated by task complexity, by choice of preferred emotion, and by higher-level coordination despite distractions prior to reaching long-term goals.

12. The key concept introduced to the scientific community is the sequential activation of serotonin, then dopamine, then acetylcholine, then norepinephrine. The first two modulators, in that order, have been observed toward mid-cycle in women during their reproductive years. Acetylcholine activation, producing a mood relating to personal loss, has been widely discussed; recently it has been suggested that these neurons are activated when an expected reward fails to materialize. Norepinephrine is associated with the phase of tension and irritability leading to anger.

Questions arising: davegoodman@juno.com