Saturday, April 30, 2011

This bodes ill for the future...

How Goldman Sachs Created the Food Crisis - By Frederick Kaufman | Foreign Policy

More and more I find myself worrying that here in the US we've effectively dismantled the financial regulatory systems put in place by the generations that fought World War II--both immediately prior to that conflict and in the years following its conclusion. That system brought us 70 years of economic stability, which stands in sharp contrast to the volatile industrial economy that preceded it--one that had spent over a third of its time in depressions since it emerged here in the 1870s.

As collateral damage, deregulation also seems to have helped undo or imperil much of the international Post-War system of finance, currency, and trading that our grandparents and their parents built. That series of agreements and assurances was a major factor both in eliminating war between the major economic powers and in the general decline in the frequency of war overall.

Under that system of trade and finance, it became much cheaper and easier to buy resources rather than seize them. Most importantly, both our domestic regulation of commodities trading and international trade agreements helped to moderate food prices both at home and abroad as an instrument of stability and peace.

And now it's gone.

In tandem with that is the dismantling of our manufacturing base in the US. Shipping factories and jobs overseas this past decade has left us in a situation in which we can no longer even reliably produce or obtain critical medicines.

As I continue to read more and more about the financial crisis of 2008 and the three decades of deregulation that made it possible, I've grown more concerned about the coming decades. Looking back now with the benefit of hindsight, I find myself wondering just how many of the most reasonably optimistic scenarios for our future were bound up in that now unraveling world system built by our grandparents and great grandparents.

Friday, April 29, 2011

And now...Mt. Adams

Photos of the northern half of the pair of stratovolcanoes that straddle the Columbia River Gorge.



It's hard to get a sense of scale for Mt. Adams, even when in its presence. Its 11,000-foot sister, Mt. Hood, would fit completely inside, with plenty of volume to spare.



You almost have to see it from the air or look at it on a map to realize it has a footprint that could contain a major metropolitan area or a small medieval European kingdom. The tallest peak in this photo is actually a false summit, set well forward of the highest point on the mountain, which is farther to the north.



Between the storms and the cold, wet weather of this La Niña year, it's been a pain trying to line up a day when the mountain was actually visible. Even in a normal year this can be something of a challenge here in the gray and misty Pacific Northwest. We can go for weeks during the spring without seeing any of these glacier-sheathed spires, which would otherwise dominate the horizons of both major and minor urban areas. And this spring has been a cold and dreary mess.

However, the meteorologists and ocean-climate people are saying that the current La Niña thermal oscillation in the Pacific Ocean is drawing to an end, and that temperatures should climb back into a normal range. That's a change that probably cannot come soon enough for the inhabitants of the tornado-ravaged Midwest and southern states this year. The low pressure created by that mass of shifted cold water off the west coast of South America is what's drawn the jet stream south to feed its energy into those amazingly violent storms.

While the month of May will probably be just as bad, there's hope that next year's tornado season will be milder without the current thermal configuration that's driving this one. And in the meanwhile, I'm certainly ready for some heat and warm weather. One of my favorite aspects of life in the Northwest is the seemingly infinite summer twilights spent walking around the warm city or the countryside, or on porches and rooftops with a book, a beer, or a glass of wine.

Monday, April 25, 2011

The genre vulgarity post

Well, perhaps less vulgar and more Not Suitable For Work, depending on your tastes.

So, thusly forewarned: it's Hugo Award season again in the science fiction literary community. While cynicism and despair look to continue being over-represented among critically well-received works, there are some bright spots as well as genuinely humorous choices among the nominees.

Bright spot: Writers of the Future winner and Utah author Eric J. Stone's excellent SF novelette, "That Leviathan, Whom Thou Hast Made" is on the ballot. So, go Eric! That and congratulations for having made it to the final round with a great piece of writing.

Humorous: The infamous and hilarious video "F*ck Me Ray Bradbury" is up for Best Dramatic Presentation - Short in the Hugos. Realizing that every blogger and their mother has already posted it, here it is again, in all its off-color glory.




Because nothing says "I love you internet," like flogging a dead meme.


Thursday, April 21, 2011

The Business Rusch: Royalty Statements Update | Kristine Kathryn Rusch

Author Kristine Kathryn Rusch's eyebrow-raising article on how some of the major publishing houses may be underreporting sales of ebooks and print books to authors within the speculative fiction community.

The Business Rusch: Royalty Statements Update | Kristine Kathryn Rusch

This does make me feel better about my decision to focus on putting together a set of good quality works to digitally publish on Amazon.com and other online venues, rather than having spent the past six months fighting to land an agent and break into the world of traditional print. And I can't say that it comes as a complete surprise that publishers may be underreporting, or may have chosen to leave in place a dated system that by its design lowballs figures to their advantage. Or that this is only coming to light now that authors can track and compare sales of their self-published digital works with ebooks put online by publishers, and now that services like BookScan are enabling authors to independently track sales of their books across a number of venues and then compare those figures with what publishers report in their royalty statements. After all, this is an industry that by all accounts makes a routine business practice of holding onto payments for authors right up until just before it becomes possible to sue them in New York State courts--NYC being where most traditional print houses are headquartered.

To be clear, I am not gloating. Eventually, I'd like to have my works in hard copy form on the shelves of brick-and-mortar bookstores like my beloved Powell's in downtown Portland. If book publishers have been underreporting by a factor of ten, however, then there is a strong possibility that coming lawsuits will force publishers to put a freeze on purchasing new novels, and may even see several already struggling large houses go out of business.

While I do think that ebooks will account for the majority of book sales in the very near future, I don't see them supplanting hard copies entirely. So far the best analogy of the many put forward to date is the comparison of the emergence of ebooks, and the ereaders like the Kindle that have made them popular, to the introduction of cheap, mass-market paperbacks/pocket books starting in the late 1930s. The paperback did not kill the hardback. Instead it created a new parallel market for writers who could not break into hardcovers, and in introducing a lower-cost book format, it attracted droves of readers who had been reluctant or unable to shell out the full cost of a hardbound novel.

Ebooks and digital readers seem to be doing exactly the same thing already. With prices ranging from ninety-nine cents to four dollars, ebooks seem to have made reading an affordable pleasure again. And a guilt-free one, even in the current economy. After all, there is a lot less buyer's remorse that goes with spending $2.99 to download a promising-looking ebook onto a Kindle or iPad than plunking down $26 for a new novel from a favorite author that is only available in hardbound. And it's a lot easier to be impulsive with that download than having to jump in the car and spend ten minutes in the aisle of a bookstore staring at that $16-$26 price tag and wondering if it's worth it. And with even paperbacks often clocking in at $10.99, impulsive book buys have been becoming a thing of the past.

So this is certainly mixed news. Good in that it will hopefully fix an industry that may well be cheating intellectual property creators of the money that they are contractually owed. Bad in that even if it does not sound the death knell for a publishing genre that has been struggling for several decades, it may result in a disruption and freeze on buying for some time to come.

Oh well. Onward to ebooks and digital publishing for us neopros. Glory in the online realm first, then the heady thrill of paper a few years down the road.

Two Photojournalists Killed Covering Battle In Libya

Two Photojournalists Killed Covering Battle In Libya : NPR

This is definitely a sad one. For me, Tim Hetherington's Restrepo was the only worthwhile documentary to come out of either the Afghan or Iraq wars. I tried watching several over the course of the past decade, and sadly the filmmakers' political views were normally about as subtle as a baseball bat to the face.

There's almost a kind of inverse law that seems to apply to war documentaries. The more a filmmaker insists that his or her work is a neutral piece, the further it sits towards one side or the other of the ideological spectrum.

While Restrepo does not always show the US Army in its best light, it does as good a job as I've seen so far of documenting the life of a small group of infantrymen spending over a year in a hazardous combat zone, as well as the chaos and confusion of battle and the sometimes bloody difficulties of waging war in a tribal mountain environment where politics are clan-based.

So I'm definitely ruing the loss of Hetherington and the even perspective that he brought to covering armed conflicts.

Tuesday, April 19, 2011

Recommendations and happy news

Fellow Writers of the Future 26 winner Tom Crosshill has an excellent story in this month's edition of Lightspeed Magazine. "Mama, We are Zhenya, Your Son" features quantum strangeness as seen through the eyes of a Russian child caught up in a sinister experiment. I warmly recommend it for those who enjoy mind-bending scientific concepts fused with good, human-level storytelling. That and it's completely free online!

Mama, We are Zhenya, Your Son by Tom Crosshill | Lightspeed Magazine

Also, Tyler Carter, the extremely talented illustrator who did the fantastic artwork for my WOTF26 story "Lisa with Child," has won another major award and gotten to make a second trip to Hollywood. This time to meet with A-list celebrities and receive some well-earned accolades for a big accomplishment: Namely winning a Student Emmy for his gorgeous BYU CGI short film, DreamGiver.



Tyler had already interned with two major animation studios prior to this achievement, so it's a safe bet that you will be seeing more of his work on the screens of movie theaters in the coming years.

Friday, April 15, 2011

Mt. Adams photographic expedition...kind of...

I spent a morning up in the highlands at the base of Mt. Adams to take some photos of the mountain. Only the volcano decided to hide behind a localized veil of clouds on an otherwise sunny day.

So, instead I motored around and took micro shots of the highland countryside while waiting for the clouds to lift.




Still winter in the mountains, while spring has sprung below



At last, the volcano begins to show itself. Big and bulky, Adams creates its own weather systems.


Did I mention it creates its own weather?


Gracile and tapered, Mt. Hood lacks the broad shoulders needed to form large storms. Clouds merely slide around her slender form.


A photo from a few years back. It may not look it from this distance, but Adams is tall and vast enough that Mt. Hood would fit inside with room to spare.

Tuesday, April 12, 2011

50 years ago



Our future as a global technological civilization lies in space exploration. The metals and light elements essential to maintaining and expanding our technology base exist in the asteroids of our Solar System on a scale that dwarfs the dwindling amounts that remain reasonably accessible to us on Earth. Ideally, one day the operations of industry will take place in orbit, using resources and power extracted from stony celestial bodies and solar radiation.

Additionally, our continued existence on Earth depends on learning how to live in space.

Specifically, within closed systems and robust habitats that will allow us to survive as societies and as unique cultures when our home world and the cosmos throw catastrophes on a mind-numbing scale at us. While the chances of a black swan event like a supervolcano eruption, a dinosaur-killer asteroid, or a power-grid-destroying solar storm on the scale of the September 1859 event are slim in any given year, the occurrence of such events remains only a matter of time. Adapting some of the technologies and methods of life off-Earth to life on our native planet will not only allow us to live more efficiently, but prepare us to live through many of the worst-case scenarios that we will face in the long haul.

I also believe that the future of the human mind waits, at least in part, in space. The adaptive pantropic biotechnologies required to let us live in environments radically different from those that we evolved in will push us to continue the development of the human body and mind. This will come not just from the expansion of existing capabilities, but also from the creation of new ones, the sum of which will take us beyond the scope of what we call human nature--this beta-version mode of being shaped by the contingent legacy systems of our brains and biology as originally adapted to a world of hunting and gathering and tribal socialization.

Don't get me wrong. I like human nature, and I hope that we not only conserve its best aspects, but that there will always be individuals who choose to stick with its original format, out of sheer stubbornness if nothing else. But it would be a better and infinitely more interesting universe if human nature were only the root of something larger and more wonderful. Certainly something more durable on the timescale of our universe than the brief and flitting existence as a species that we are currently faced with.

Sunday, April 10, 2011

Sensory modalities

I took a drive up the White Salmon River Canyon to shoot some photos of Mt. Adams this past week, and ended up having an unexpected moment of profound silence after climbing out of the car to wait for the clouds to lift off the big volcano (upper left corner in the photo below).



Utterly quiet at first. No bird song, no susurration of wind, no animal cries or even the rustling scurry of small furry things--the mountain clearing and adjoining forest apparently still in the final stage of winter's suspended animation at this altitude. After a minute or two, the nearly subliminal sound of melt water trickling out of nearby mounds of snow grew audible, seemingly ex nihilo from the void of sensory deprivation. After several additional minutes of deliberate listening I registered a distant breeze in the top of the pines in the direction of the nearest highland valley. Then a lower, enveloping murmur of water in motion all around me. Not just the melt from the nearest piles of snow, but liquid seeping beneath the logged clearing's floor of dry pine needles. Last to join this continuum of expanding awareness was the faint, shifting rasp of those millions of fallen needles as they warmed quickly in the direct sunlight after a night of well-below-freezing temps. Almost a soft, surf-like roar on the very faintest, lowest edge of hearing.

I'd almost forgotten what it's like to be in such a complete silence, and just how much information there is in the bandwidth of human hearing once the mind has adjusted to it.



One thing I very much miss from the days of being an Army cavalry scout is just how plugged into the environment it's possible to be. We really use precious little of our in-built audio capabilities when living in an industrial society.

And that's true of other senses as well. One of my favorite memories of gradually becoming aware of just how much we miss in the world around us is from the train-up for a ground invasion of Kosovo during the NATO air war against Serbian forces there in the spring of 1999. While the main body of the armor battalion I was stationed with fought its way through a narrow pass against the Combat Maneuver Training Center's opposition force at Hohenfels, Germany, we scouts slipped around and then spread out behind the constricted terrain to put eyes on choke points farther along the lines of advance, and to check for additional enemy defenses.

After several hours of quietly following a wood line and locating a pair of concealed enemy observation posts, my section sergeant and I came across a narrow point between two forested hillsides that formed a natural defile--a confined passage that would constrict the movement of an armor formation and make an excellent killing zone.

Only there was no one there. Suspicious, we dropped our rucks and crawled carefully through the flora of the defile's southern slope; then studied the slope on the far side. Still nothing. So the section sergeant had us wait, lying in the bushes and staring for several minutes.

And suddenly it was all there. Camouflaged fighting positions, hidden trenches, concealed bunkers, and several dozen soldiers in battle dress, eating MREs, performing maintenance, and keeping watch in total silence. It was very much like one of the Magic Eye paintings popular in the early 90s. Only rather than a confusion of pixels hiding a crude stereographic 3D image, the chaotic visual noise of the German forest and hills had yielded man-made patterns of simulated lethality.

As time went on, I grew more adept at seeing what was out in the world before me, rather than seeing my expectations of what should be there. Or differently put, seeing the physical world rather than the assumptions that my mind populated it with.




Part of what made my senses so keen back then was my gradual acclimation to dealing with natural information rather than cultural information. Natural information being sensory impressions of the physical world and the short-term, immediate implications of those perceived objects and events. It's being able to pick out a faint wisp of smoke and realize that it implies the presence of fire. This is an ancient category of data. A primal type that our brains originally evolved to process. It's similar to and overlapping with, but different from, the abstract cultural information that is social relations, cultural norms, and behavioral expectations.

Both types involve immensely sophisticated, associative modeling in the brain, but cultural information is tied into some recently evolved neurobiological data processing modules that generate an awareness of other human beings' perceptions, and their potential emotional responses to both present events and possible future situations.

Natural information is a product of modeling the external world; cultural information is a glimpse of the inner worlds of others, both as individuals and as groups. Natural information is focused more on the here and now, and only a little on what comes next. Cultural information, meanwhile, is much more probabilistic, and it goes hand in glove with those simulations of the future that we call reasoning and anticipation.

One negative aspect of my past two years of living in the middle of a large city has been the temptation to give up that natural-information level of awareness out of politeness. The urban landscape is obviously a vastly different volume of space from a mountain clearing near the foot of a glacier-covered volcano. There are a variety of social protocols that govern the very act of seeing in the shared spaces of an urban foot traffic environment. Among those informal norms is a very limited amount of acceptable eye contact with strangers on the friendly streets of Portland, and precious little time in which it's appropriate to take note of another human being's body language, physical proximity, and what they're doing with their hands.

Active sight violates all kinds of common social expectations. The requisite scanning and the punctuated pauses of deliberate, evaluative gazes make people nervous. The timing is all wrong, and being so deliberately engaged sends a warning signal. While active sight can create a zen-like sense of connection to the fluid and ever-evolving environment for the watcher, it generally provokes a negative emotional response in bystanders.

As I wrote about last year, learning to transgress social spaces is a learned skill set. It's one that's very much tied into how we conceptualize different spaces, as well as the cultural weight of the rules that we assign to those environments. Such rules and models very much affect not just our expectations of what's in those spaces, but how we let ourselves observe them.

Of course some degree of compromise is necessary for a harmonious life in a largely tranquil urban setting, but there are days when it's fun or simply reassuring to put on a pair of sunglasses and really look around to see what's actually out there, rather than being content with the choppy glimpses that are socially appropriate to existing in a modern society.

Thursday, April 07, 2011

The semiarid eastern Columbia Gorge

East of the Cascades




A working replica of Stonehenge built at nearly the same latitude as the original. A bit of Northwestern oddity that was originally a memorial built for fallen soldiers by a prominent local member of the pacifistic Society of Friends.




Unforgettable – Chapter 1 - Unforgettable: a novel by Eric James Stone

Hugo nominee and Writers of the Future winner Eric J Stone's free serial e-book experiment. Spies, tradecraft, and an aspiring spook with an unusual talent.


Unforgettable – Chapter 1 - Unforgettable: a novel by Eric James Stone

Sunday, April 03, 2011

Friday, April 01, 2011

Locus Online News » Bacigalupi and Watts to Collaborate on Depressing Dystopian Shared World Anthology

So at least some of the people in the industry do have a sense of humor about how dark and cynical science fiction has gotten. I certainly had a good chuckle over this April first Locus 'article.'

Locus Online News » Bacigalupi and Watts to Collaborate on Depressing Dystopian Shared World Anthology

Human thought

The introduction from the document that I use to keep track of everything I've learned about human thought over the past fifteen years.

The Evolution of Human Thought

“Something beyond our understanding occurs...the transformation of an objective cerebral computation into a subjective experience.”
        -Oliver Sacks

“Minds are what brains do.”
        -Marvin Minsky

Basic sentience likely arises from a mapping function in which the parallel processes of sight, sound, touch, smell, and feel are woven together into a representation of the external world and internal body[1]. Such an animal consciousness perceives only the timeless continuum of the now, and its first apprehension of the past came with the short-term memory system, which holds impressions for just seconds or minutes in the form of electrical currents between neurons[2].

Procedural memory--also known as reflex memory or muscle memory--was the first system of durable memory to arise in our evolutionary history[3]. Procedural memory involves protein synthesis, forming new neural connections or strengthening existing ones, thus allowing organisms to learn new physical skills as well as to refine innate reflexive skills passed down genetically in the form of instinct.

Declarative memory that can be consciously accessed arose as a mechanism for storing and recalling two types of information: episodic and semantic. The episodic memory subsystem encodes the autobiographical recall of past situations that have been experienced[4]. The second subsystem, semantic memory, stores conceptual information that is independent of the situation in which it was learned.[5]

Semantic memory allows for the creation of meaning by enabling individuals to remember the abstract commonalities of multiple events, and thus discover underlying relationships. Knowledge of such cause-and-effect dynamics is preserved in the form of ideas and images known as concepts. Not surprisingly, words, perhaps the most efficient carriers of abstract meaning, are stored within the semantic memory.

Once written in the declarative memory, experiential and semantic information is not laid out in neat, static lines, but in interconnected web-like networks. The brain by its very layout defines the present by the past, cross-referencing new sensory information with preexisting knowledge as part of a process of constant conceptual association[6]. Much of this takes place within the highly developed association areas of the cerebral cortex, the brain’s outermost layer, which neurologically distinguishes humans from other animals and even our primate cousins.

The ability to create complex associations between bodies of conceptual knowledge and processed sensory information allows the forebrain to add a kind of editorial commentary to animal consciousness--one that highlights underlying causes and effects. This may well be the source of human sapience, which includes not only a basic sentient awareness of the world, but also our perceptions of complex implications, abstract dynamics, and the existence of complex systems made up of temporary components. In other words, we can infer the network of relationships that is a forest, despite only ever seeing the trees.

Additionally, within the outer cortex, external perception, internal thoughts, and emotional triggers are integrated and balanced to help mold volitional responses[7]. At the heart of this decision-making executive function is the orbital frontal cortex[8] [9]. Situated behind the eyes, and massively integrated into the emotion-mediating limbic system, this area helps to shape emotional reactions by inhibiting or modulating such signals[10].

It does so in part by constructing conceptual perspectives about situations, and then using those perspectives to modify emotional responses. This is a process in which conceptual knowledge is freighted with emotion. In doing so, in imbuing concepts with emotional weight, the frontal lobes help to create a coherent and holistic worldview known as a paradigm[11].

The cognitive constituents of a paradigm are concepts, ideas, and the simulacra models of physical objects[12]. Tangled webs of interrelated concepts, ideas, and simulacra form sub-maps known as fields of knowledge. Each field of knowledge covers an aspect of the world and gives rise to an understanding of that broad area’s dynamics. The overall paradigm that emerges from these fields of knowledge is a worldview and meta-model that influences which aspects of reality a person notices.

In the words of the political philosopher Samuel P. Huntington, we use our paradigms to:

1. order and generalize about reality;
2. understand causal relationships among phenomena;
3. anticipate, and if we are lucky, predict future developments;
4. distinguish what is important from what is unimportant; and
5. show us what paths we should take to achieve our goals.

The world-awareness that a paradigm makes possible is a tool of unprecedented power, extending our awareness of possible consequences and potential outcomes into the future, just as our short-term and episodic memories allow us to look back into the past. This has allowed us unmatched flexibility in fulfilling our survival needs. However, to be fully utilized, this world awareness must be accompanied by an advanced awareness of self--a complex and emotionally driven model which incorporates assessments of personal capabilities as well as needs that must be satisfied. This requirement for a representation of self gave rise to the neurological machinery of self-image[13].

The building blocks of this self-image are schema, which are emotional convictions about capability and self-worth. The setting of durable schema is a large part of childhood development. Such convictions are not easily altered, and the lifetime evolution of schema is the core story of personal growth and maturation. Additionally, self-image is heavily influenced by the opinions of others. Our ancestors evolved in a niche in which their survival was closely tied to their ability to function within a troop or family band, and it may be that the network of mirror neurons[14] that runs through the emotional limbic system of our brains plays a crucial role in giving the views of others a weighty impact on our self-image.

While schema are the unique property of individuals, concepts and ideas can be communicated and shared. The rise of syntax-structured language either allowed or accelerated the process of these memes accruing in common bodies of knowledge and values known as culture[15]. Cultures act as repositories of such memetic information, allowing concepts and ideas to be passed down through generations. This has allowed concepts to undergo a multigenerational process of Lamarckian evolution[16]. Bodies of culture (mythos) helped to generate the material stability and intellectual climate that gave rise to the logos: a body of rationally derived concepts.

The fruit of the logos, technology, has impacted the development of human thought in part by creating lives of leisure, the likes of which our forebears could have scarcely imagined. No longer burdened by the struggles of day-to-day survival, the inhabitants of the post-industrial world have found that dreams, fantasies, and the pursuit of happiness play an ever-larger role in their lives. Where their grandparents were pragmatists who soldiered through lifetimes of limited opportunities, current generations are faced with the temptations, possibilities, pitfalls, and challenges offered by economies of affluence.

This essay is about the ongoing evolution of human thought--from its biological origins to the distinct religious and secular paradigms that have arisen during its course. It also contains an examination of emotion, as cognition and feeling are inextricably linked in the processes of thought in a very real neurobiological sense. Emotions, in a sense, are the weight of thought as situational awareness, concepts, memories, ideas, risks, and potential outcomes are weighed against one another in the processes of goal setting and decision making. Understanding thought also means understanding the brain, and so we will first examine the organ of thought and the evolutionary forces that shaped it.

The interplay of genes and environment

“Human brains are evolved to be molded by experience…”
-Martin Teicher,“Scars That Won’t Heal: The Neurobiology of Child Abuse”

As animals, we humans are generalists who have adapted to new climates by altering old behaviors or creating new ones. We have established ourselves above the Arctic Circle by innovating new types of clothing, dwellings, tools, and local hunting techniques. Though physically evolved for walking, running, and climbing, we have taught ourselves not only how to fly, but how to navigate by the stars and ascend to the moon. We invent new behaviors with a frequency not seen elsewhere in the natural world.

In a sense, we humans embody a paradoxical tension between nature and nurture--between genetically encoded behavior and behaviors learned from our surroundings. We are the product of an animal line of evolution, with a physical shape and bodily needs that necessitate a broad spectrum of survival and social behaviors that defines us as a species. We also carry genetic drives and reflexes, which bias us towards specific modes of behavior within the continuum of all possible behaviors, especially during our earliest years of life. At the same time, our lineage has also endowed us with an associative cortex and all the innovative behavioral flexibility that this neurological wetware makes possible.

Thus, rather than a genetically programmed mind, we have guidelines and innate biases as well as a receptiveness to environmental influences. Instead of a complete body of instinctual knowledge, we are predisposed to be meme gatherers, language generators, society creators, and possibly even the intuitive authors of moral systems in order to facilitate group cohesion and the reproduction of our shared genes and memes[17].

Additionally, our ancestors evolved during an age of extreme climate upheavals. The past three million years have seen repeated, dramatic climate shifts. Our planet has continued to become a cooler and drier place, with factors like orbital perturbations and the ongoing growth of the Himalaya Mountains affecting Earth's global weather patterns. Being subjected to climate shifts and extreme weather events like decades-long droughts may well have played a significant role in driving the evolution of intelligence and behavioral flexibility in hominids and our fellow primates, as they were forced to locate and exploit new sources of food during protracted periods of chaos. That dire need for varied food sources may also explain why some of our earliest ancestors switched from being herbivorous to omnivorous.

The Niches We Inhabit

The status of our ancestors as omnivores played a central role in the mental evolution of humankind[18]. Our ability to choose between a wide variety of plants and prey animals presents us with an array of troublesome decisions summed up in Dr. Paul Rozin’s famous phrase, the omnivore’s dilemma. In short, where dedicated carnivores and herbivores spend little if any mental energy deciding what to eat[19], omnivores such as humans are confronted with a wide array of potential victuals. Some of these are nourishing, and in times of scarcity may make the difference between life and death. Others might well be toxic, leading to a painful debilitation or demise.

Life as an omnivore calls for analytic intelligence, a keen palate to detect possible toxins, and a prodigious memory to recall what can and cannot be safely eaten. In humans, it may also have favored the evolution of language instincts, in part as a means of sharing information on both the availability and safety of food types, and possibly as a means of facilitating cooperative hunting.[20] As Michael Pollan points out in his book The Omnivore’s Dilemma, cultures act as repositories of food lore, often encoding warnings into the very names of lethal foods, like the death cap mushroom.

While gestures likely formed an important component of early human communications about food and other issues, the development of speech necessitated significant physiological changes. Our ancestors’ voice boxes elongated into a configuration that permitted greater verbal sophistication, despite the increased risk of choking to death while swallowing food. The speech center of their brains also swelled, and this may have been a key step in our ability to employ complex abstract reasoning.[21]

In addition to the choices offered by an array of potentially edible fungi and plants, the ability of our ancestors to eat meat shaped the evolution of our brains. Acquiring calorie-rich flesh, whether from hunting, scavenging, or cracking open bones with basic stone tools to access the nutrient-rich marrow, allowed for the survival of individuals with larger, calorie-intensive brains, as well as supplying a steady source of the vitamin B-12 essential for the functioning of the brain and peripheral nervous system.[22]

Living in hierarchical troops and bands, our omnivorous ancestors increasingly relied upon cooperation, tool use, and complex reasoning in order to survive. Both autobiographical recall and semantic memory grew more complex in this environment, facilitating both tool use and advanced social behaviors[23].

A mutation took place, shrinking the jaw and reducing the muscle strain placed on the front of the skull. This cleared the way for the forebrain to increase to the maximum size that the birth canal could pass.[24] Static-instinctual behavior likely atrophied in favor of the association area’s dynamic analytical abilities, leaving behind a basic framework of encoded motor-reflexes, developmental biases, and emotional infrastructure in neurons and hormones that predisposes humans towards useful modes of behavior.[25]

However useful, the enlargement of the forebrain carried a high developmental cost. Where many animals can walk and navigate shortly after birth, human infants are entirely dependent on their parents for mobility and feeding as their brains unfold and blossom during the first years of life, setting the stage for our later behavioral complexity and sophisticated reasoning abilities. Along with the need for a long neurological maturation period, our infant bodies require time to grow large enough to support our oversized heads. The emergence of the association cortex also meant evolving increasingly sophisticated emotions to encourage us to use our new capabilities, and so the human limbic system increased in complexity alongside the forebrain.

Additionally, our efficiency as social meme creator-gatherers increased when individuals lived longer, so that they could assemble and preserve complete bodies of memetic knowledge. This meant that the mutation rate slowed in order to retard the development of cancers. Large, intelligent mammals with complex communication schemes, such as humans, apes, and whales, have half the genetic mutation rate, and consequently half the cancer rate, of dogs and other short-lived, socially simpler mammals[26].

Uniquely, we humans appear to be the only primates in which males do not leave their native family band or troop upon reaching puberty. This may contribute to both the stability of groups and the long-term accumulation of knowledge. Additionally, we are unique in the degree to which we form nested groups--a family that is part of a tribe, which in turn can be a sub-group within an even larger social organization.[27]

Little fossil evidence has been uncovered concerning the emergence of symbolic thought and advanced reasoning. Anatomically modern humans have existed for between 100,000 and 200,000 years, with brain sizes and shapes identical to those of today, yet the surviving kits of simple stone tools appear to have remained largely static for most of this period. There is, however, some physical evidence for an abrupt behavioral change around fifty thousand years ago, in a wave of symbolic art and new arrays of advanced Paleolithic tools. During this period behaviorally modern Homo sapiens also began to displace or assimilate neighboring Homo erectus and Neanderthal populations in the Near East, Southern Asia, and later, Europe.[28]

One possibility that has been put forward is that Homo sapiens was pushed into a significantly increased utilization of its abstract reasoning abilities not by a genetic change in a small local population, but rather by an environmental or social stimulus that affected the organization of individuals' brains. One candidate is grammatically structured spoken language.

With its complex structures, and its ability to facilitate symbolic thought and communication through words, grammatically structured language stimulates complex neuronal organization within the cortices associated with verbal communication, semantic memory, and conceptual associations.[29] In other words, the mental strains of acquiring and using natural language may have been what pushed our ancestors into full sapience as individuals[30].

As symbols embedded with conceptual meaning, words allow humans to internally manipulate and externally communicate large numbers of concepts with a speed and efficiency seen nowhere else in the natural world. Words effectively allow us to rapidly string together and communicate dozens of concepts in grammatical structures without having to dwell on each of those concepts.

While humans and other hominini also made use of internal visual images in the act of thinking, the addition of words facilitated a world-altering increase in our capacity for symbolic thought. The impact of adding spoken words as complex conveyers of concepts in a visually oriented species likely opened novel avenues of thought, even as it complemented the existing visual-based imagination and reasoning.

Assembling consciousness: Functional components of awareness
. . .

Footnotes

[1] The brain’s parietal lobe—which appears to orient us in space and time—may play a key role in shaping this consciousness. One of its roles appears to be the filtering of internal information from the continuum of sensory inputs. From there, the brain appears to use specific sets of neural pathways to process information that has been tagged as referring to the self, while information pertaining to other humans and objects within the environment travels through a different set of processing paths. This differentiation between external and internal information may be necessary for an organism to experience basic consciousness as a self. A seizure of the parietal lobe has been cited as one possible biological explanation for the mystic’s experience: a state of consciousness in which the barriers between self and the universe dissolve, and the subject experiences all of existence as an undifferentiated unity.

[2] This system of memory--essentially currents flowing between sets of neurons—may have first evolved so that an object or a threat would not vanish from awareness the moment it passes out of direct sight or hearing.

[3] Declarative memory appears to depend on the hippocampus and the more recently evolved cerebral cortex, while procedural memory is supported by older structures such as the basal ganglia and cerebellum.

[4] Procedural memory may provide clues to individuals whose ability to create new autobiographical memories has been destroyed by neurological trauma, as happened to the fictional protagonist of the film Memento. Dr. Oliver Sacks describes a real-life patient who was likewise left unable to recall anything in his life after the removal of a brain tumor, in the story “The Last Hippie.” After several years in a hospital, the sheer familiarity of moving around the ward—perhaps captured by procedural memory—allowed the patient to recognize that he had been there for some time, as well as to recall the names of those whom he met and frequently interacted with after his operation. Oliver Sacks, An Anthropologist on Mars (Alfred A. Knopf, Inc., 1995).

[5] Semantic memory likely has its origins as a specialized system that stored information on the basis of relational location in space, as opposed to the autobiographical system’s time-based organization.

[6] This is likely part of the neurological function known as working memory. Working memory allows the brain to temporarily store information in an easy-to-manipulate, active form. Its components are the mechanisms of focus and attention, visual imagination, and audio imagination. It is experienced in our consciousness as comprehension and the awareness of reasoning.

[7] In other words, this fusion of conceptual knowledge and inner thoughts interfaces with emotion, and might well be the wetware system that generates conscious emotion-modifying perspectives.

[8] Interestingly, this area appears to exhibit a great deal of variation between individuals, both in humans and in other primates.

[9] The limbic anterior cingulate cortex also plays a crucial role in supplying motivation. Patients with damage to this area display little or no sign of volition, and those who recover from injuries in this cortex report having experienced an utter lack of motivation to take self-initiated action or to respond to events around them. In the words of neurologist Antonio Damasio, the anterior cingulate cortex is a location within the brain where “…the systems concerned with emotion/feeling, attention, and working memory interact so intimately that they constitute the source for the energy of both external action…and internal action….” Antonio R. Damasio, Descartes’ Error: Emotion, Reason, and the Human Brain (Penguin Putnam, New York, New York, 1994), pgs. 71-74.

[10] These inhibitory/modulation processes are functions which can obviously be impaired by alcohol or various narcotics. Also, some neurologists classify the orbital frontal cortex as a part of the limbic system due to the high degree of integration between the two areas.

[11] This should not be taken to mean that the orbital frontal complex is the site of our consciousness. Memory is distributed in various regions of the brain, and sensory information is processed in discrete cortices. Awareness may be a series of synchronized parallel processes which generate simultaneous representations of external events and internal states. As such, consciousness may arise from cross talk between the various cortices and nuclei that are active during such a representational process. These representations, in turn, may be composed of patterns of firing potential within circuits of neurons, which when fired as a group cause a sensory cortex to generate or reconstitute an image or sensation. Or they can cause physical movement by activating a motor cortex. Such dispositional representations are proposed by neurologist Antonio Damasio in his book, Descartes’ Error: Emotion, Reason, and the Human Brain, pgs. 94, 102-105.

[12] Simulacra are the autobiographical and muscle-memory recall of physical objects and their properties.

[13] The need for self-image may also have given rise to the internal narrative voice heard within one’s mind. Or this inner voice may have been a means of harnessing the spoken word to serve as a medium of thought, creating an unprecedented power to solidify vaguely sensed abstracts into concrete concepts. One explanatory hypothesis for schizophrenia is that the internal voice of the mind is a fusion of several sub-voices that represent the various functions or inputs of awareness. The breakdown of a timing mechanism may cause the various sub-voices to fall out of sync, creating a perception of the component voices as irresistible or near-irresistible external speakers, despite their internal origin.

[14] Mirror neurons fire when an individual observes the actions, expressions, or experiences of another, causing the watcher to experience an echo of the observed person’s emotion or physical sensation. Such a sympathetic experience may help in creating emotional interpersonal bonds, as well as with imitating performed actions, and play a role in self-contemplation. As one scientist notes, "…mirror neurons may enable humans to see themselves as others see them, which may be an essential ability for self-awareness and introspection.” Lindsay M. Oberman and Vilayanur S. Ramachandran, “Broken Mirrors: A Theory of Autism” (Scientific American Reports: Special Edition on Child Development, September 11, 2007), pg. 25.

[15] Various troops and bands of monkeys and apes appear to develop unique bodies of grooming practices, social gestures, and foraging techniques. Thus it can be argued that culture predates both humans and syntax-structured languages. Early primate memes were passed on by imitation and non-verbal instruction rather than the spoken word. Additionally, language as we know it may have evolved out of a combination of gestures and cries.

[16] Lamarckism is the idea that an organism passes on to its offspring characteristics acquired during its lifetime--for example, that an animal which builds strength through exertion will pass this strength genetically to its progeny. While widely discredited in modern evolutionary biology, Lamarckism has enjoyed something of a resurgence when applied to memes, in that these mental replicators are actively modified by the thinkers who hold them, and then passed along thus changed.

[17] MIT linguist Noam Chomsky has famously proposed a set of innate grammar rules to explain how children intuitively acquire language, as well as the commonalities found among human languages around the world. Harvard biologist Marc Hauser proposes a similar inherent set of drives that encourage humans to develop moral systems, just as monkeys have been observed to display propensities towards fairness and reciprocal altruism. While I personally believe both men’s hypotheses to be correct, Hauser’s 2006 book Moral Minds presents insufficient evidence to successfully argue his claims.

[18] Among the other physical traits shaped by our place as omnivores are our jaws and teeth, which allow us to process both plants and meat. Additionally, the ability to control fire may have allowed the intestines and jaws of our ancestors to shrink considerably, as we no longer needed such muscular mandibles and extensive guts to process an all-raw diet. Our largely vestigial vermiform appendixes appear to be a non-functional holdover from that era of larger intestines, and may once have helped our ancestors digest uncooked leaves; individuals born without appendixes suffer no ill effects or drastically reduced digestive capabilities. The taming of fire also affected our species’ development by vastly expanding the range of plants and animals that could be eaten, as cooking broke down many of the toxins, bacteria, and structural features that prevented us from consuming many types of flora and fauna. Finally, this reduction of the jaw and its need for an extensive infrastructure of tendons and anchor points may have been key in allowing our foreheads to expand, becoming less sloped in order to accommodate a large cerebral cortex.

[19] The most famous example of the impact of a single-food, low-thought diet is that of the koala. These mammals apparently went from eating a varied diet to dining exclusively on eucalyptus leaves. As a result, natural selection worked against the species’ large, calorie-intensive brain, resulting in a congenitally atrophied, minimalist brain which occupies only 60% of the cranial cavity--the rest of the skull’s internal volume being filled with fluid.

[20] The evolution of both language instincts and the elongated voice box may well have been driven not by any single use, but rather by a broad spectrum of usefulness. If this was the case, then it’s possible that increasing lingual prowess became a driver behind the evolution of multiple human traits. Unfortunately, we do not have direct fossil or genetic evidence that describes the emergence of syntactic language.

[21] Broca’s area: A region in the frontal lobe that generates articulated speech.

[22] Not surprisingly, omnivores such as primates, rats, and bears tend to be among the most curious and intelligent species in terms of learning and problem solving. Dolphins, with their complex cooperative hunting behaviors, are also highly intelligent. Also remarkably bright are elephants, who have varied diets, complex social hierarchies, and a migratory range that requires a good memory for water and food locations in order to survive lean years.

[23] The semantic memory may have evolved from a distinct spatial memory system designed to store information about objects in the context of location (space) rather than in the temporal order of autobiographical recall. This system may then have taken on the additional function of storing general knowledge of positive and negative stimuli independent of any single situation and therefore common to multiple occurrences. Such a general recall of factors that are independent of any one situation would be useful for making predictions during novel scenarios.

[24] Rather than being the products of entirely novel mutations, it is possible that phenotypes (traits) such as a small jaw, flat face, and enlarged brow on a proportionally large skull are actually juvenile primate traits that are retained in adult humans. Some types of neuroplasticity in the human brain may likewise be extensions of early-life neuroplasticity, from when the brains of fetal, infant, and juvenile primates undergo extensive genetically driven development along with intense periods of organization driven by external learning and socialization stimuli. One possible example of this is the ability of human children to rapidly and intuitively acquire complex syntactic languages up to around the age of sixteen. This linguistic plasticity of the brain is why deaf children achieve the best integration of cochlear implants when the implants are placed early in life, preferably prior to the acquisition of language. The retention of juvenile characteristics in adults is known as neoteny.

[25] More on the specifics of this later.

[26] Princeton University’s long-running Amboseli project has found similar rates of aging during its multi-generational observation of yellow baboon populations. It is possible that our fellow primates, who depended on collecting large bodies of learned food-gathering behaviors as well as the locations of seasonal food supplies, are similarly long-lived.

[27] Nested groups can arise among other primates under unusual conditions of confinement. The author worked with a large group of three hundred and sixty Japanese snow monkeys living in a three-acre corral at the Oregon National Primate Research Center in Beaverton, Oregon. The overall group structure consisted of five ranked families with both internal hierarchies and a structure of dominance between the families.

[28] Physical evidence, in the form of a finger bone and accompanying gene sequences, shows that a hominid species contemporaneous with H. sapiens and H. neanderthalensis lived on the Eurasian continent. These Denisovans appear to have contributed to the human genome, their genes being found among modern ethnic Melanesians.

[29] A hypothesis advanced by Philip Lieberman of Brown University argues that it was the fine motor control developed for walking upright that permitted the emergence of syntactic speech. Speech, walking, and motor functions are regulated by shared neurological circuits located in the subcortical basal ganglia region of the brain. If this hypothesis is true, then the usefulness of upright mobility was a strong factor in the rise of language, and should be counted as a key evolutionary force in our mental development. That said, the emergence of bipedalism in hominini is not fully understood. Potential factors that may have contributed to its development from knuckle-walking quadrupedalism include: enhanced cooling, by exposing less surface area than quadrupeds to direct solar heating--especially important for species with large, heat-generating brains; increased efficiency in long-distance travel over extant primate knuckle walking, including the ability to follow wounded prey over long distances; and the ability conferred by increased height to spot ambush predators at greater ranges.

[30] Unfortunates who do not learn a syntactic language prior to age sixteen, when the brain’s final stage of linguistic plasticity shuts off, commonly suffer deficits in abstract reasoning for the rest of their lives. At a clinical level, language is very much recognized as a force that pushes the brain to develop the advanced state of organization that is accepted as the normal range of reasoning abilities in adults.