
Friday, January 28, 2005

A Cognitive Un-revolution

In science, more often than not common sense gets in the way of the acceptance of ideas that don’t make sense, but are nonetheless true. The Copernican, Darwinian, and Einsteinian revolutions are cases in point, as rotating earths, natural selection, and a mutable time and space certainly go against the grain of what our senses and sensibilities tell us is true. In terms of psychology, and how we think about how we think, common sense is of two minds, both literally and figuratively. It’s a commonplace truism that we behave by following some rational order or calculus of logically considered values, but it’s equally true that we recognize that emotions or drives have a hand in how we act. This eternal battle between reason and emotion is great stuff for drama, pop psychologists, and Freudian psychology, but for many psychologists it’s a bit too messy. For cognitive scientists, evolutionary psychologists, and behaviorists, incorporating the metaphors of emotion is a risky thing, since talking about feelings can degenerate into undisciplined metaphor, and leave psychologists talking and behaving like ‘Dear Abby’, or worse, like Dr. Phil! So they quarantine all that touchy-feely lingo, and come up with their own language that is logical, rational, and maps to a brain that obligingly works in the computational way such logic requires. And of course they trumpet this fact as a revolution, since like the scientific revolutions of yore, it breaks the bonds of common sense.

The problem is, breaking the bonds of common sense is one thing, but a scientific revolution requires a bit more than a neat logic that seems to imply a new worldly state of affairs. You have to have the ability to test that knowledge and to apply it. Miss either of these and all you have is a nice tale to tell that, to paraphrase Shakespeare, is full of sound and furious rhetoric, yet signifies nothing. And the rub is, these sub-disciplines in psychology have no practical meaning to the great un-lectured masses, who promptly ignore their revolutionary wisdom. So, in spite of the ‘cognitive revolution’ or the assorted revolutions heralded by behaviorists, evolutionary psychologists, and even neuroscientists, people will still refer to wisdom that embodies comfortable and familiar emotional metaphors and seems applicable to their daily concerns, namely the wisdom of the Dr. Phils of the world.

When common sense is replaced with new metaphors that describe new and unfamiliar realities, our inconvenience is assuaged by new procedures that help us understand and predict new things. If not, we ignore it, and allow it to gather mold in the confines of an academic journal, as is the case with cognitive, evolutionary, and behavioristic psychology. Which brings us affectionately to a new branch of psychology called, appropriately, ‘affective neuroscience’. Affective neuroscience simply puts emotion, or affect, back into the equations for behavior, saying no less than that we can speak of feelings in the same breath as the neural processes that allow them to be. So for psychology, emotion is back in the game, and perhaps unlike the other affect-less branches of psychology, it may make meaningful predictions and create procedures that have more practical value than the Barnum and Bailey nostrums of pop psychologists. One thing is for sure: it will all seem in the end like common sense. We shall see.

Monday, January 24, 2005

Idiot Savant

Idiot Savant: An individual who exclusively focuses on the mastery of one aspect of performance (doing math, playing piano), to the exclusion of all other skills, both technical and interpersonal. Known in less severe cases as nerd savants. Idiot savants are to be distinguished from those folks who focus on all aspects of performance and are masters of none, but think they are savants in one way or the other. They are known as ‘that bunch of idiots’, or more formally as religious fundamentalists or Republicans. (From Mezmer’s Dictionary of Bad Psychology)

As an individual who has a decidedly more than passing interest in psychology, my penchant for thinking about it all the time does call into question my ability to act and think about other important things, such as taking out the garbage. So regardless of whether my musings on the topic merit a Nobel or booby prize, my wife will think that as a man about the house, I am a total idiot. Which brings me to man’s special genius, and perhaps handicap, namely his ability to focus on one thing to the exclusion of almost everything else, and to do so forever. Isaac Newton was so accursed, and attributed his development of the calculus and the laws of gravity to simply thinking about them, constantly. Of course he also thought constantly about the alchemical disciplines that aimed to discover how to transmute lead into gold, and it is here that posterity has judged him not as a savant, but as a total idiot.

When we constantly think about any topic, we will master that topic, and amaze our friends with our intellectual acumen, if of course they care to listen. Mozart, Newton, and Einstein did this to popular and intellectual acclaim, but unfortunately male obsessions are a bit more mundane. So what do us guys have for the future edification of the world? Usually it has something to do with recounting baseball statistics, reaching the tenth level in Dungeons and Dragons, or recalling all the episodes of Star Trek. Of course, we keep this special genius secret, partially because of modesty, but mainly because no one really cares. Which brings us of course to real idiot savants, which is an unfortunate and pejorative name to give to those individuals who through a quirk of nature are neurologically attuned to focus on inconsequential acts that in their perfect execution become quite extraordinary. Whether it be the ability to perform unerring mental calculation, play the piano by ear with note-worthy perfection, or just remember what one had for breakfast every day of one’s life, idiot savants are relentless in their quest for a single-minded perfection. In fact, by being single-minded, they have no mind for anything else, hence the unfortunate term idiot.

The curse of genius and madness is that both are single-minded things. Whether it is displayed in obsessive compulsiveness, addiction, or autism, to call it good or bad, creative or merely stupid, depends ultimately upon the acclaim of others. It does make sanity a relative thing, and renders our judgment on the poor souls who think a bit too straight to remember their manners or when to take out the garbage to be, well, the mere opinion of an idiot.

Saturday, January 22, 2005

B. F. Skinner and the Magic Toaster Oven

One thing everybody needs to know is how to make toast. Baking bread is a matter for the specialist, but toast is for the great unbuttered masses. As baking technology progressed over the years, there was a little fry cook who perfected a tiny oven that could toast bread on both sides. Escaping the bakery, the fry cook took his technology to the masses, and soon everyone had little toasters, and centers for toaster technology spread throughout the land. But sadly, most of the toasters didn’t work, made poor toast, and ended up with many people feeling rather burned. But the fry cook did not admit defeat, the masses did not forget, and because of a bad baker, baking got a bad name, and any toaster ovens that came out of bakeries were immediately suspect.

Luckily for human invention, at least of the practical and mechanical sort, the end of this scenario rarely happens. Escaping the laboratory with your grand invention in hand can be a recipe for grand success or an equally grand failure. In technology, usually the faux pas of an enterprising genius is replaced with a better model by a more enterprising and market savvy type who understands how to deliver what the masses really need. From computers to cars, the list is endless. The public forgives and forgets past mistakes, and whether the product is a toaster or TV, progress marches on.

For entrepreneurs and inventors, failure is evident, final, and sealed by the apathy of others, namely the consumer. For philosophers and psychologists, however, bad ideas often have a different fate, as their creators commonly admit no failure, blame the consumer, and continue to advocate ineffective packagings of ideas even though they are as useful as bad toaster ovens.

Take the concept of behaviorism.

For the first fifty years of the twentieth century, behaviorism was an innocuous thing. Behaviorists were folks who studied the behavior of animals and tried to discern the lawfulness of behavior in a laboratory setting. Because animals couldn’t really talk to you to tell you that they would have preferred not to live in cages and run through mazes, they indirectly told the psychologist through their behavior, hence the name behaviorist. Behaviorism existed quite nicely until the behaviorist B. F. Skinner had the brainstorm to escape from the laboratory, and apply the same principles of reward or reinforcement to the common man. Well, it didn’t work, and left a bad philosophical taste in the minds of everyone. So behaviorism was shunned, ignored, and, except for the tiny squeak of a Skinnerian movement that refuses to die, was relegated by popular and academic opinion to the junkyard of failed intellectual ideas. In other words, behaviorism was toast.

In our fanciful example, one bad baker did in the toaster industry, a disaster for sure if it happened in real life. But equally disastrous and real has been the public identification of behaviorism with B. F. Skinner, when behaviorism has marched on past Skinner, identifying through animal subjects the real processes that underlie learning. Called affective neuroscience or bio-behaviorism, it still does not garner respect or much attention, as behaviorism is still identified with a baker of ideas who to the end of his days never learned to toast bread. But the future may yet bode better for this new breed of behaviorist. After all, people need to get their behavior right as much as they need to rightly make toast.

Saturday, January 15, 2005

Ivory Towers and Ivory Basements

Long ago, in the kingdom of Mystonia, there was an ivory tower. It rose above an ivory castle, which was planted in turn above a suitably ivory basement. In the tower lived the supreme intellects of the land, who, shielded from the unwelcome bustle of everyday life, thought of perfect forms algebraic and crystalline. Meanwhile, down in the basement below, the accounting firm of Watson and Skinner inventoried all the facts of Mystonia in a perpetual and beautiful audit. This was reality, true science even, and it was all a reflection of ethereal ideas and inventoried facts.

If only, the Ivorian philosophers and accountants thought, the children would understand. Romping in the fields or shepherding livestock, the children dreamed, like dreamers always do, of fantastical images that expanded and contracted reality in wholly new ways. Roaming the hills, young Albert thought of descending in an elevator at infinite speed, while little Isaac sat under a tree and wondered how a falling apple was like a falling moon. Meanwhile, the young shepherd boys Kent and Jaak wondered what animals would say, if only they could speak.

The Ivorians called to task their dreaming youngsters, and in an intellectual scold, told them that reality was not in a child’s imagination of time and space and affective states but in abstract and disembodied facts. To them, metaphor was alien to the pursuit of scientific truth, and the sooner they abandoned anthropomorphizing the world the better. But the children protested. They had performed their homework, and had done the math. And indeed, the metaphorical mind experiments of youth could be translated into the finer metaphor of mathematics, and be used to give meaning as well as measure to the world.

But the Ivorians remained unconvinced, and with a huff shuttled back to their dim abodes. Meanwhile, the children went back to play, and when they grew up as Albert Einstein, Isaac Newton, Kent Berridge and Jaak Panksepp, reinvented the world.

When we think of science we think of its settled results, ideas that are refined and distilled into the abstract language of mathematics, the genome, and rates of behavior. But the history of science and scientific thinking tells us otherwise. As any physicist will tell you, the mathematics on a chalkboard reflects the imprint of metaphor, as the ideas that spin in their minds reflect fantastic notions of the big and the small, of time and space. It’s only when you write it down that it seems so incomprehensible. Like music, intellectual symphonies start in the mind not as notes, but as sounds. And so it is for all the sciences, from physics to biology. Lately, this same Ivorian argument, long a non-issue in the physical sciences, has riven psychology. As neuroscience reveals that the mind is moved by its pleasures and pains as much as a body is moved by gravity and inertia, the Ivorians rustle again, and reassert that mental life is no more than a dull collection of behavioral or neural objects or disembodied mentalistic entities like will or desire. But they will shuffle off as before, while the children reinvent the world.

What the Ivorians did not understand, and do not understand to this day, is that reality does not wink at you from a mosaic of facts, and neither is it apprehended solely as a mathematical abstraction. It is one thing that comes from a binding of things, the metaphors of mundane experience with the abstract metaphors of language. Metaphors are the ties that bind reality to our world and to our abstract projection of our world. To see both sides of this coin is at once to romp in the field, to think thoughts with color, sound, and fury, and, distilling them into an inescapable logic, to reinvent the world.

Wednesday, January 12, 2005

Albert Einstein and the Sheboygan Institute of Physics

It is a little known fact that Albert Einstein first submitted his paper outlining his theory of special relativity to the Journal of the Physics Institute of Sheboygan (somewhere in Illinois, I think).

Needless to say, the reply he received was not something he anticipated.

The first reviewer rejected the manuscript because nowhere in it did Einstein demonstrate how relativity could shed any new light on why dinosaurs went extinct. The second reviewer objected that Einstein used non-Euclidean math, when everybody knew that Euclidean math was the proper tool to use. Finally, the third reviewer said relativity didn’t make sense, of the common variety that is.

Luckily for Einstein and physics, this little known fact was of course an unknown fact. Ideas in physics, like ideas in most of the physical and biological sciences, are understood, tested, and communicated with a common set of metaphors. Although formalized in mathematics, the real thought is in the thought experiment. Einstein was the first to popularize the notion that manipulating the common metaphors of existence, like observing the behavior of two speeding trains, or feeling weightless in a rapidly descending elevator, can be used to develop ideas that have a mathematical logic, which in turn can be used to make predictions about the big and the small. Like Newton and his apple, Einstein and his rushing trains, or modern explanations of the universe based on infinitesimal loops and cosmic strings, physicists can freely use whatever metaphors they want, knowing that in the end they can all be reduced to logical principles, and because they are rooted in a reality observed, can in principle be tested.

When we move across the academic hall to the psychology department, metaphorical thinking is just as common, except that no one can agree on which set of metaphors is the best to use to describe behavior. Are the metaphors of Freud best, or do we use Skinner’s, or even Dr. Phil’s?

No one knows.

What this means is that psychologists spend more time talking past each other than to each other, because no one group will accept the metaphorical currency of the other. Like the Sheboygan reviewers, each psychologist has his own notion of what the language of psychology ought to be. The reason for this impasse is simple: at root, psychologists have no set of metaphors that can describe how the human brain works, and untethered to a neural reality, ideas fly off like errant meteors.

Think about it. Before the telescope and microscope, no one had a clue about how the universe or our bodies worked, and hence metaphorical conceptions of the universe and life ran wild. It was only with Galileo and Pasteur that metaphors of time, space, and disease began to be constrained, and came to represent the metaphors of scientific knowledge that we know today. In other words, because we had metaphors that were grounded in the reality of observation, physicists and laypeople alike at last shared a common metaphorical language for how the world is and how it works.

Only when we have the metaphorical language of how the brain works, and in particular how motivation emerges from a working brain, will we be able to escape from the confusion that is modern psychology. As for myself, I recently submitted my own little article on a topic of motivation (namely muscular relaxation) to a few psychologists to solicit their opinions. The first one responded that it was not written using the data language of behaviorism, therefore he could not comment. The second psychologist was perplexed, and pleaded ignorance, or was it stupidity?
I am now waiting for the third response, which no doubt will have something to do with dinosaurs.

I can’t wait.

Sunday, January 09, 2005

The Usual Suspects

In the movie ‘Casablanca’, Humphrey Bogart has just shot an evil Nazi. The Captain of the Gendarmes rushes up, and seeing Bogie with smoking gun in hand, exclaims to his arriving compatriots: “Someone’s been shot! Round up the usual suspects!”

For uncritical slouches like the general public, and intellectual slouches in particular like your typical psychologist, it’s easy to solve a behavioral enigma by ignoring the smoking gun, and going straight to the usual suspects. In modern times, quite often the usual suspects are cavemen. Cavemen are of course long gone, yet live conveniently through the behavioral characteristics they pass on to us through their genes. But just as leaving DNA behind in a strand of hair can convict you of a crime, we can convict our ancestors of influencing our behavior through leaving their DNA behind as an inheritance. This makes for great armchair theorizing, and stupid articles that find their way into mainstream journals and magazines. In an article in the latest issue of Time magazine, the caveman suspect rises again to be the main culprit behind, ta dum, sports frenzy. The article quotes a clueless social psychologist who became perplexed with the fact that people get all revved up to watch and celebrate baseball and football games. Naturally, this inexplicable behavior had to do with ancient tribal rites held long ago that bonded ancient people together, and presumably increased their collective fitness, which they proceeded to genetically hand down to you, me, and every other Boston Red Sox fan.

[Image: Cave Man Frenzy]

Naturally, people like to play games, and when the stakes are high, so is their interest. And of course, in games there are winners and losers. But do we need cavemen to figure out the resulting equation leading to sports excitement? Indeed, a better culprit is all those cave monkeys and cave shrews that preceded our ancestors in that great chain of evolutionary being. Playing, competing, looking out for and enjoying the positive uncertainties of life represent foraging or seeking responses that preceded cavemen by millions of years. We know them now as games, but because we have a lot of grey matter to virtualize the implications of our wins and losses, there is no need to carry off our winnings like a squirrel carries off nuts for the winter when we can just imagine it. Best of all, the smoking gun is literally in our heads: the neural circuitry that makes playing, well, fun. You see, the brain was the smoking gun all along, and right before you. Sherlock Holmes would have known this, and even, I gather, Inspector Clouseau, all without having to trot out the usual suspects.

Saturday, January 08, 2005

Meet Joe Green

It was twenty-five years ago, and I was attending my first opera, Verdi's masterpiece 'La Traviata'. Sitting amidst a finely attired and coiffed audience, my reaction was swift. Mouth agape, I thought: what the heck is this?

It sounded like Italian circus music, although of the finest quality. And they were all singing, no, screaming, in Italian! What were they all jabbering about? What was the point if you couldn't understand their point? They could all be screaming multiplication tables for all I knew. So I exited stage right, in a manner of speaking, hoping to catch at home the latest rerun of Star Trek.

Naturally, of course, this was the reaction of a cultural barbarian, of which I have since repented with endless and appreciative visits to the opera. Yet, looking back at my first operatic experience, I actually did have a point. Why indeed sing in Italian if the audience grammatically picks up nothing?

[Image: Joe Green]

In Hans Christian Andersen's fairy tale, 'The Emperor's New Clothes', the Emperor paraded around town, quite naked of course, but clad in the finest invisible and sheer attire, so sheer in fact that it was weightless. The people applauded, except for a little boy, who exclaimed that the emperor had no clothes. Although the audience and the emperor recognized at once the error of their ways, I'm not so sure honest candor works so well in real life.

You see, in spite of my fondness for opera, there's still no real reason to sing in a native language otherwise incomprehensible to the audience. Purists may protest that singing in English ignores the natural purity and poetry of the original language, an argument that holds water, I suppose, if you think that singing in German sounds wonderful and that Shakespeare would also sound good with an Italian accent. Anyways, I don't buy it.

It's tradition really, habit, the fact that doing things one way for a long time confers some logical inevitability on doing it that same way forever. Like sitting on a favorite chair, taking the same route to work, or not eating pizza for breakfast, old habits die hard, and we will tend to justify them emotionally even if we cannot justify them logically. For opera, the purists squawked when subtitles were finally added below the action on stage, but settled down as the new habit of actually following the plot kicked in. I don't know if opera will ever take the more radical step of singing in the accessible mother tongue, any more than we will ever think of Giuseppe Verdi in his proper English translation, Joe Green. But it's a thought.

Thursday, January 06, 2005

Genghis Bush

In 1776, the Americans decided that they were fed up with English tax policy, a situation that the English found quite revolting, literally. Having recently fought the French for possession of the continent, the English thought that the Americans would do no worse than the French had, and that the rebellion would soon be over.

Bad move.

In 1798, the Sultan of Algiers took American seamen hostage, thinking that they would be ransomed by the Americans, following the traditions of the French.

The Americans responded by burning Algiers.

And so it went, from one sanguinary conflict to another: 1812, 1845, 1860, 1914, 1941, 1950, 1991, and on and on. Other nations, for some reason, kept thinking that Americans would act like the French. Even in the Civil War, both the North and the South thought that the other side were pushovers who would sue for peace after the first skirmish, demonstrating that Americans could think that even their compatriots would act, well, like the French.

It’s either a flaw in our character or a strength, take your choice. But it does explain a lot about American psychology. Coming from nomadic stock, roaming and settling the open plains, and surviving in a wild continent with no indoor plumbing has a way of shaping your character. For Americans, it all helped to develop a code of honor, a sense of superiority bred by our survivability, and a penchant not to take insults lightly. Character traits not exactly like those of the insular French, but more like those of another people who tamed a continent, the Mongols.

The Mongols, like the Americans, were a nomadic breed less inclined to pursue cultural niceties than to conquer a continent, in this case literally. And they did not brook insults well, and certainly did not like to be misjudged to act like other peoples, like the French. So, you did not insult them or get in their way, otherwise your whole society might end up building a pyramid, with your skulls.

[Image: Genghis Khan: Don't Call Him French]

Today, Americans are much more culturally tactful and politically correct than the Mongols, but that doesn’t mean they’re becoming French. Take this whole war on terror thing. The Mongols would have understood. After all, when they were insulted by the Caliph of Baghdad, they too occupied the country, and left the city not with elections and $100 billion in aid, but a pile of skulls, rising to the sky, a solution that perhaps many Americans secretly ponder.

Wednesday, January 05, 2005

May you live in interesting times

It was Christmas, and friends and relatives were visiting. Our gathering was full of interesting conversation, but alas it was all eminently forgettable, which is why an annual get-together is useful to remind ourselves of the transitory accomplishments of our lives. Not so, however, for two members of the group, one of whom was my father. Both men in their eighties, they were spry and alert, and both had interesting stories to tell. They told of their experiences in WW II, a time if not for bravery and heroics, then at least for stories that had a unique and abiding interest. From 1944 to 1945, my father had flown a military transport plane, the DC-3, on missions to resupply Italian and Yugoslav partisans, meanwhile having a darn good time in Italy. The other was a gunner on a military bomber, the B-29, who had flown missions in China, and regaled us with tales not of combat, but of how his crew frequently found their way home while running out of gas. We all listened raptly to their stories, as if the war years were the only times that it was truly interesting to be alive, or attempting to stay alive as it were. We could of course have responded with our own stories, embellishing them with fanciful events that would dwarf these oft-told wartime adventures, but we would be dragged down by our fibs, by either our audience or ourselves. Better then to listen to these shopworn tales from a fading breed of men.

Being interesting is important, and oftentimes it's more important than being useful or true. It is all a matter of storytelling, or the use of new and colorful metaphors to describe new worlds. In our personal lives, we can scarcely get away with a tall tale; ironically, not so in the world of academics, where being interesting while shading the truth is the key to publication, fame, and tenure. The problem is, as we learn more and more, there are fewer things that can be called truly interesting. In his book 'The End of Science', the author John Horgan surmised that soon we will have explanations for all there is, leaving only ironic tales of fabulous futures that are no more fanciful than the daydreams with which I would hardly dare rebut my father.

For psychology in particular, we want new stories, new insights, and new truths. Whether that new drug is no better than the one it replaced, or a new therapy no more likely to cure a troubled mind than an older form, they at least represent new stories. And of course, we will listen. There's a certain faddishness to it all, where novelty is the true measure of the mind of man, but at least I can take comfort in the fact that the stories of the pop gurus of contemporary academic and popular psychology will fade faster in the minds of men than the true stories I heard one December night.

Tuesday, January 04, 2005

Reductio ad Absurdum

One of the biggest debates in philosophy, and sources of confusion, is the respective difference and importance of explanations vs. predictions. To many psychologists, the fact that some procedure, chain of logic, or therapy 'works' is good enough, seeing they can churn out predicted results time and again like sausages without having to bother about other perspectives that describe from different vantages how the parts of that procedure, logic, or therapy do what they do. But I would argue that without explanations, procedures could never be bettered. Take medicine, for example: in the 19th century, the disease of malaria, which literally means 'bad air', was thought to originate from, you guessed it, bad air. So bar all that bad air from getting into your house, and you would be far less likely to get the disease. The procedure worked, but for entirely different reasons: closing off your house kept out not just the air, but also the mosquitoes that came in with it. As it turned out, an explanation of malaria, a description running from the large (epidemiology) to the small (microbiology), permitted new and better procedures involving new medicines and pest control. You would think of course that scientists would have a great respect for explanation. And you would be wrong.

Although it is easy to see how explanation can make procedures better, it's not really helpful for folks who have devoted a lot of time and effort to creating a useful procedure to have that same procedure become useless when it is systematically explained. It shows them to be big dopes, and that is something they cannot abide. So what's a poor theorist to do? Why, eliminate those darned explanations, of course! Thus many psychologists, enamored of their pet therapy or procedure, are understandably fearful of explanations. In the physical and biological sciences, we would spot this phobic behavior in a flash, as who today would accept a description of the universe or of a biological function without understanding it from the macro to the micro? Tragically, much of psychology is caught up in the mindset of centuries ago, when the lack of adequate tools such as telescopes and microscopes precluded true descriptions or explanations of how things work. Presently, with the advent of new tools that, like the telescopes of old, describe how the mind actually works, many psychologists are beset by a new enemy: the explanation. And they are not happy.

As an illustration, consider a recent email exchange that spun my head around several times, Exorcist-like. It was with Dr. Steven Hayes, a creator and proponent of something called relational frame theory, a somewhat inexplicable (to me at least) psychological theory that explains how we behave through a behavioristic analysis of language. It has even given birth to a therapeutic approach called ACT, which presumably will cure all that psychologically ails you. The problem is that modern neuroscience tells us that much of our behavior is caused by nonconscious and affective events that have little or nothing to do with language. Would it not be prudent, I asked, for your procedure to be a little informed by the facts of how the human mind works? To which I got this paean to the pragmatism of prediction:

"I understand behavior when I can predict and control behavior with precision, scope, and depth. That is behavior analysis as I understand it. You understand behavior when you've modeled the mechanical system. But why stop at those three? Why not say "to understand behavior you have to understand biochemistry?" Sure that underlies the brain systems you point to. But why stop there? Why not say "to understand behavior you have to understand subatomic particles?" Doesn't that underlie the physical and chemical systems that underlie the biological systems you are speaking of? How can your understanding be firm if you do not know what underlies it? That is the import of your statement: "To understand behavior you have to understand all three." OK -- so be consistent. Follow out the logic of what you believe.

Dr. Hayes: a predictable little guy

The key is this last sentence: whether one should follow the logic of what one believes, or (as I would have it) the logic of what one sees. If you have a broader vision, from the very large to the very small, your logic will doubtless become a lot better, though the procedures it supports will likely, as in our malaria example, be a whole lot different. The true logic Hayes was hinting at was a bit different: namely, that different levels of perspective are really complex, and that they cause us to lose focus on the subject at hand. This is the time-worn argument against reductionism, the philosophy of science which holds that the whole is understood through its parts; the argument assumes that such reduction makes you lose perspective on what's truly important (namely the prediction), and (shudder) causes you to look upon the world as a mere collection of atoms. This is a common logical scarecrow used to frighten those who care about explanations, and it is nonsense.

Every child in an elementary science class learns about things from the large to the small, yet learns not the intractable minutiae of calculation but metaphors that encapsulate the large and the small in a phrase. Called 'level adequate' concepts by the early neuroscientist Erich von Holst, these give us a metaphorical perspective on the world that integrates many levels of observation. You don't have to be a brain scientist to understand the mind, nor a rocket scientist to understand physics. Because neuroscience is a relatively young field, it has not yet formed the level adequate descriptions of how the brain works that could sweep away the postures of those who would figuratively shut us indoors without air. In time, I look forward to finally breathing free.

For much more on explanation and common sense (and no hint whatsoever of RFT), check out my new e-book on the psychology of the internet:

And here's a bit longer video on explanation by the distinguished physicist David Deutsch.

Monday, January 03, 2005

I Robot

Every parent knows that if we allowed kids to do whatever they wanted, they would eventually kill themselves. There is a higher reasoning called parental authority that limits what kids can do, and when we grow up we agree, with appreciation. To a kid, parents are endless cornucopias that, if played just right, can give them anything they want. But parents are wise to the game, and through their denial of childish wishes they provide a healthful balance to reckless desire. The problem, though, is that as our creations literally wise up, our desires will be served by a new semi-animate class of parent: the robot.

You can see it coming. Intelligent agents are now embedded in our appliances, from toasters to TVs. They know what we want, and they provide it without a hint of regret. And if we end up killing ourselves in a slouch of idle self-stimulation, at least we can blame ourselves for not embedding parental authority in our machines.

The late scientist and science fiction writer Isaac Asimov thought he found a way out: the robots that populated his fiction had to protect humans, obey all commands that did not put humans in jeopardy, and, of course, preserve themselves. His three laws of robotics made it all seem simple. Robots were caring, submissive, and obedient, great traits if their human masters possessed unerring common sense. But the rub, as every parent knows, is that today's pleasure is tomorrow's poison. So what is a good robot to do? In the movie I, Robot, robots evolved, became dangerously bossy, and would not hesitate to kill a few folks to preserve the race. A less melodramatic fate, I feel, is in store. I figure that as our machines become more intelligent, they will see the dire ends of our choices, and will evade deliberate disobedience by simply breaking down more often, forcing us to walk to the store, visit friends, eat better, and otherwise engage in a healthier lifestyle as we bitch about obedient machines with short fuses. And if we ever come alive in the mind's eye of some great cosmic machine named God, perhaps we should understand, as we encounter life's little problems, that they are His own special way of serving our needs while obeying three simple laws.

Sunday, January 02, 2005

THE mental disease of our times, or how to bring a remote problem under control

We all know the problem. It's a near daily affliction, maybe soon to be recognized as a disease. The symptoms are universal. Roaming the house in growing disorientation, overturning cushions and furniture, snapping at spouses who have no clue, wailing in despair, and all the while progressively doubting our own sanity.

What is this awful mental condition: manic-depression, anxiety attack, dementia, rabies? It has no medical or psychological name, though I gather it will soon be entered into the DSM IV (dimwitted syndrome manual) as the remote control syndrome. The remote control syndrome occurs when we cannot remember where we placed some vitally important object, such as our car keys, glasses, address book, or our TV remote. Because these objects are so important, we understandably take it as a sign of encroaching madness that we can so easily misplace them. Hence, we oblige ourselves by going bonkers.

Not to fear, though, for the fault lies not in some mental defect but in a mental asset: our memory. If we had to remember all the stuff we think and do, our little brains would soon freeze up, leaving us idiot savants babbling forever about all the places we put our lint. No, it is a sign of intelligence to forget things, but forgetfulness has its own logic. Important events need a little time to, well, sink in, and our hurried existence does not grant us the few seconds needed to contemplate the possible significance of every single action. So, while thinking about something else, we deposit our remote, keys, or other object in a place that even Indiana Jones would find difficult to excavate. Similarly, when we are introduced to people, we immediately forget their names as our attention is directed to other niceties of conversation and etiquette, and we end up feeling foolish when, later, we haven't a clue whom we're talking to.

The solutions, alas, are as obvious as they are hard to employ in practice. It's hard to slow down to smell the roses, contemplate the remote, or count the keys, and it's a chore to secure all our valuables in their own easily accessed parking garage. But we must try, at least to mitigate our emotional agonies, and the irritation of friends, coworkers, and spouses who question our sanity when we cannot find our stuff. It's a lesson indeed that I hope you will remember.