Tuesday, January 30, 2007

Some Thoughts on Information - III

This is taking a lot longer than I thought it would, and I'm still not to where I really want to be. So, tolerance, tolerance, please. Also, once into the nitty gritty here if I make a big boo-boo, please let me know.

I think we all agree that one of the most important events in life, maybe even the definition of sentience, is the reception, use, and ultimately dissemination of information (just gilding the lily on the dictionary definition of sentience.) So, the next thing to think about is how information gets to the human brain. Consider that, at the very root of this process, as, indeed, the root of all biological processes, is a bimolecular chemical reaction. Hopefully, if I make it through this stage, that fact will lead to some interesting conclusions, if not many more questions, about information and its uses.

What follows is extremely simple at the basic level. The details (like the details of our discussion so far, which seem to branch out further and further, like the Mandelbrot set) become unbelievably complicated, e.g. the physics of photoisomerization. Please ignore the details if you wish. I will try to bring it to a very simple scheme at the end. Throughout, I have used Wikipedia. It is the current library of Alexandria.

First, consider a human who is in a tank of water that is at a temperature that is completely neutral (skin temperature). This person is able to breathe with an apparatus and there is no light or sound. In other words, consider a person in an “Isolation Tank.” For short periods, this can lead to an enhanced state of self awareness, creativity, etc. However, after prolonged periods, the subject can experience anxiety, depression and, I suspect, just go crazy. (That sensory deprivation has been used routinely in the detention of “unlawful combatants” at Guantanamo and elsewhere is one more crime by the Bush administration, because it is certainly tantamount to torture.)

The point here is that the human brain has to have sensory input to function in a normal manner. In other words, it needs information. Without constant information input, the human brain and mind would go off on some wild tangent from reality. So, it makes a lot of sense to examine in detail how information gets to the brain and, by extension, how it is handled once there.

I would like to go through the process by which a visual sensory input (photons) makes it to the brain. Again, please try to see this in its simplest terms, since the devil is not in the details (except one, which I will come to).

It is of extreme interest that visual sensory input is quantized into photons. Please recall that Einstein won the Nobel Prize for his explanation of the photoelectric effect, published in 1905 (the same year as the theory of special relativity!). While light behaves both as a wave and a particle (a photon), it is the latter that is of most interest here. In spite of this “quantization” it is my view that quantum mechanics plays no real part in the mechanism that we are about to discuss. We could talk about this at length later. Indeed, Einstein himself never fully accepted quantum mechanics, and to this day it has not been reconciled with his general theory of relativity (i.e. it can't yet accommodate gravity.)

I will try to show that at each stage of the process the information is digitized at the most basic level. This has profound implications for how we process the information and, ultimately must have some reflection on how we think. It also has a real world prediction about free will (don't choke, please).

Figure 1 - Structure of the eye and retina

Figure 2 - Structure of the retina

First, consider the retina. It consists of multiple layers (see figures 1 and 2) with the rods and cones, the photon sensors, at the bottom of the layers (this arrangement is counterintuitive, since you would think the photon receptors would be the first thing a photon encountered). The rod cells of a human eye are capable of reacting to a single photon! That is, they are as sensitive as the Hubble Telescope, since a single photon is the ultimate limit of electromagnetic information.

In the cell membrane of a rod cell (cones are similar but are used for color perception) is a molecule called opsin (Figure 3).

Figure 3 - 3D structure of opsin

Buried in the center of opsin is a molecule called retinal (tiny red figure in Figure 3 and Figure 4A). This is actually a derivative of Vitamin A and, of course, the favorite of Bugs Bunny (carrots). Vitamin A is also, sadly, very low in the diet of some third world children, leading to blindness. We give it as a supplement to infants. Too much, though, is toxic. It is related to Retin-A for acne and, most interestingly, to the treatment of acute promyelocytic leukemia with all-trans retinoic acid. How this causes differentiation of the leukemia cells into mature, non-malignant cells probably involves G proteins, but I digress.

Figure 4A - Cis-retinal and 4B - Trans-retinal

When a photon of light hits retinal, it can be absorbed, causing photoisomerization from the 11-cis configuration to the all-trans configuration (Figure 4A -> 4B).

How exactly this is accomplished appears to be quite complicated. It can be understood in various ways. Since I am an aficionado of molecular orbital theory (vs. valence bond theory) I prefer to think of it as the photon exciting an electron in the bonding orbitals of the retinal molecule to a higher orbital. Once in the higher orbital, the molecule is more stable in the trans configuration. However, if you are really interested, fly to here. This isomerization takes place very fast, somewhere on the order of 200-500 femtoseconds.
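Out of curiosity, the energy one such photon delivers is easy to estimate from E = hc/λ. A quick back-of-envelope in Python (the 500 nm wavelength is the usual textbook figure for peak rod sensitivity, taken here as an assumption):

```python
# Back-of-envelope: energy of a photon near rhodopsin's peak absorption.
# The ~500 nm wavelength is a textbook figure, used here as an assumption.
PLANCK = 6.626e-34      # Planck's constant, J*s
LIGHT_SPEED = 2.998e8   # speed of light, m/s
EV = 1.602e-19          # joules per electron-volt

wavelength = 500e-9     # meters (green light, near the rod sensitivity peak)

energy_joules = PLANCK * LIGHT_SPEED / wavelength
energy_ev = energy_joules / EV

print(f"One {wavelength*1e9:.0f} nm photon carries ~{energy_ev:.2f} eV")
```

That works out to roughly 2.5 eV, comparable to typical electronic excitation energies in organic molecules, so a single photon really does carry enough punch to kick an electron in retinal into a higher orbital.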

In the trans configuration, the retinal molecule doesn't “fit” into the opsin protein, and this causes a G protein signaling pathway to be activated. I am not sure of the exact mechanism of the pathway activation.

A nice animation of this process can be found here.

G proteins are ubiquitous in metabolism as “second messengers.”

The G protein in the rod cells is transducin. Transducin contains G-alpha, G-beta and G-gamma subunits. Activation of transducin by the conformational change of retinal causes GDP (guanosine diphosphate) to be exchanged for GTP (guanosine triphosphate) from the cytosol at the G-alpha subunit. In turn, the now activated G-alpha subunit dissociates from the G-beta-gamma pair and increases the activity of cyclic GMP phosphodiesterase. This enzyme breaks down cyclic GMP; as cGMP levels fall, the cGMP-gated cation channels close to the passage of Na+ and Ca2+, causing hyperpolarization of the cell membrane.
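One way to see why a single photon can matter at all is to do the bookkeeping. The sketch below treats the cascade as a simple chain of multiplications; the per-stage gains are illustrative round numbers I have assumed (not measured rates), and the function name is my own invention:

```python
# Toy bookkeeping of the phototransduction cascade's amplification.
# The per-stage gains below are illustrative round numbers (assumptions),
# chosen only to show the shape of the chain, not measured values.

def phototransduction(photons: int) -> dict:
    rhodopsin_activated = photons                      # ~1 photon : 1 rhodopsin
    transducin_activated = rhodopsin_activated * 500   # one active rhodopsin hits many transducins
    pde_activated = transducin_activated               # roughly 1:1 transducin : phosphodiesterase
    cgmp_hydrolyzed = pde_activated * 1000             # each PDE chews through many cGMP molecules
    # Falling cGMP closes the cGMP-gated cation channels -> hyperpolarization.
    return {
        "rhodopsin*": rhodopsin_activated,
        "transducin*": transducin_activated,
        "PDE*": pde_activated,
        "cGMP hydrolyzed": cgmp_hydrolyzed,
    }

print(phototransduction(1))
```

Whatever the true numbers, the point is the shape: each stage is a molecular counter with gain, which is how one photon becomes a signal big enough to change the membrane potential.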

To be continued.....................

Sunday, January 28, 2007

Odds and Ends

I am really working hard on the next "Information" post. I've gotten hung up on action potentials but, for anyone interested, I'll have it up later this week. I had to review such simple things as electrochemistry.

The Growlery has a couple of posts that bear commenting:

The source of the word "limey" is indeed from the practice of eating limes for vitamin C by the British Navy. I didn't mean it in a scurrilous tone by any stretch. While all the other sailors were getting the handle "scurvy bastard" attached to them, the Brits were drinking gin and tonic (quinine - malaria prophylaxis) with, of course, limes. There you are.

As for Pommie, I always heard it as "Pommie Bastard" when I lived in Ireland. But this was so long ago that it was at the beginning of the last "troubles" in Northern Ireland, and it was used by the "Jackeens" in Dublin. They're the ones who've lived in Dublin since before the Vikings got stomped by Brian Boru at Clontarf. I always thought it was a reference to that little "pom" on the top of the British soldier's cap. I accept that it might have arisen in WWI with the influx into Europe of all those Australian troops fighting the "bloody Turks" and losing their legs as in "The Band Played Waltzing Matilda." Pomegranate sounds fine to me.

Aside stimulated by WWI and British soldiers: My college roommate's grandfather lived in Boston (on Beacon Street.) He was a doc in the British Army in WWI and had been in a mess tent in Egypt when T.E. Lawrence came rolling by. Small world.

Oh, about your Prince:
Prince Charles has been reminiscing about living in Australia as a teenager, saying he hiked for 70 miles and was called a "pommie bastard".

Here's another thing from the Growlery:
This question of symbols is one to which Jim Putnam periodically returns, for example here and more recently, less explicitly but just as certainly here ... it's a fascinating one, and goes to the very core of every area of human activity. Every one of us holds within her or his mind a model of the universe, and that model is built from symbols.
This whetted my appetite since there is no doubt that the internal workings of the mind must use symbols in some nuanced way and I don't, at the moment, have a biochemical explanation for this.


As for the comments by the Growlery re education and information, I wish I had an eon to reply. I can see I would have to sharpen the pencils since he has got the hands on experience.

Hat tip to Jim Putnam and his recent comments here. Again and again I go back to the scenario of "what should a good German have done?" Norman Mailer's new book is about Hitler's early life; essentially an attempt to understand how someone could become such a monster. It may be that he was born that way, but certain experiences shaped him in his course. Are we not shaping the experience of children both in Iraq and, to a lesser degree, in the West by perpetuating an endless civil war? What will be the result on the emerging Iraqi population? What will a young man of 5-10 years old be like in 15-20 years, after having lived through this carnage? Has it not radicalized the Lebanese and Palestinians? Where do Hamas and Hezbollah get their recruits except from this pool of radicalized young men and, increasingly, women?

We have much to ponder.

Friday, January 26, 2007

Friday Crab Rockfish Blogging

Slightly Limey?

Wednesday, January 24, 2007

Quick Comments

I feel very honored to have another blogger commenting on some of the things the Growlery and I have been throwing around. It just amazes me that the closer you look at even the most simple question the further it spreads out into even more interesting fields, like a Mandelbrot set. Take this remark:
In a previous life, my job was to assist clients to develop information systems. Reading Felix and Dr. C., I now become aware of how we limited our definition of information. Information, when designing a computer system, was data that was not previously known. In other words, our computer information systems allowed the manager to view data in such a way that some bit of data was used in a manner that allowed the manager to make better decisions.
Actually, I find this a provocative definition: "data that was not previously known." It casts information into a utilitarian role. Who would dispute that, except perhaps an aesthete, i.e. one who valued information just for the sake of information? Perhaps an art connoisseur, or a cellist. But then, you might rejoin that someone who just wanted pure information (if there is such a thing) was a dilettante.

You see what I mean, it's like opening Pandora's box.

On another note, the Growlery has taken issue with my disparagement of Intelligent Design. That is a discussion that needs to be had, but I beg off until we get further into our current doings. However, I accept full sackcloth and ashes for belittling Intellectual Freedom even the slightest bit.


I suspect that we could go on talking about localization and dispersion of information for a long time and not even scratch the surface. It was interesting to follow the course from the Neanderthal, where almost all was localized in the individual, with some shared in the immediate community, to Homer where it was further dispersed, to the ages of libraries and print, and now to the age of the Internet. In each age the individual human has varying levels of on board information. As the Growlery points out, with A.I. robots coming on the scene, the delocalization of information is even more complicated.

But, we still have to get information into the human. From my standpoint, there are a number of interesting aspects of this process which we can go into in the next post.

Exactly how much information can someone know? How much should a person know (onboard vs. access via, e.g., the Internet)? Is there a threshold amount of information after which the brain starts kicking things out the other end (wherever that is)? Furthermore, do we really forget things that we have learned, or are they just non-retrievable? Once you start asking these questions you can’t stop. Maybe I can make a stab at it after we develop information intake.

Aside: I am impressed that pedagogy spends little time teaching children facts anymore, at least as we were taught facts only 50 years ago, but I am not sure what is taught in its place. For instance, do they even teach the times tables anymore with the advent of hand held calculators? What about all those poems that people used to memorize. God forbid that I should ever have to go through medical school again (a recurring nightmare) and face that memorization gauntlet.

My staff, a number with children, informed me yesterday that they do not teach cursive handwriting anymore and that even in the early grades they learn keyboarding (which rhymes with skateboarding and, of course, snowboarding.) Children when they have to “write” by hand use printing. Sounds like regression to me. I suspect they will next be knocking out cuneiform script on clay tablets.

Saturday, January 20, 2007

The Dying Children of Iraq

I promised not to blog on Iraq but I just can't stand this:
"Sick or injured children who could otherwise be treated by simple means are left to die in hundreds because they do not have access to basic medicines or other resources," the doctors say. "Children who have lost hands, feet and limbs are left without prostheses. Children with grave psychological distress are left untreated," they add.

They say babies are being ventilated with a plastic tube in their noses and dying for want of an oxygen mask, while other babies are dying because of the lack of a phial of vitamin K or sterile needles, all costing about 95p. Hospitals have little hope of stopping fatal infections spreading from baby to baby because of the lack of surgical gloves, which cost about 3.5p a pair. (emphasis added)
Yes, that's 3.5 pence, or 7 cents!

In the meantime, we are spending $2,000,000,000 a week killing Iraqis. The total cost of the Iraq "War" is over a trillion dollars. Yes, that's $1,000,000,000,000. This would buy over 10,000,000,000,000 pairs of gloves. Ten trillion pairs of gloves.

And I sit here doing nothing.

(Oh, and "Up to 260,000 children may have died since the 2003 invasion of Iraq." If America was Iraq, that would be over 3,000,000 children. That's three million. I can't even think about these numbers without getting ill.)

Friday, January 19, 2007

Some Thoughts on Information - II

We got as far as Homer with our thinking about information.

The reason I picked Homer is because, to my simple mind, he marks the transition from an oral tradition of information passage to a written version. Exactly when written language developed is hard to determine (for interested parties, go to the Wikipedia entry on the History of Writing). While there are surely sagas, stories, information (such as that found in the Egyptian pyramids) that had been written (using various hieroglyphic and cuneiform scripts) prior to ~1200 BC, in the Western tradition we credit Cadmus with bringing the Phoenician alphabet from Tyre (bombed to smithereens by the Israelis last summer) to Greece and introducing the written word there (as described in “The Marriage of Cadmus and Harmony” by Roberto Calasso.)

After this, information was written down: at first, cumbrously, by hand, reaching its apogee in the scripting monks of the wee islands off the southern coast of Ireland (see “How the Irish Saved Civilization” by Thomas Cahill and, if you really want an experience, go see the Book of Kells at Trinity College in Dublin). This was a learning experience for humans, translating the spoken, observational world and mapping it onto the written page. This is the essence of writing, a mapping process. Interestingly enough, it is exactly what DNA does when it maps information onto the milieu of the cell. (To let you know where I am driving in these spiels: DNA maps its information onto proteins that have specific kinetic rates. The proteins have primary structure (the amino acid sequence, determined by the code), secondary structure (how stretches of that sequence fold locally into helices and sheets), tertiary structure (how the whole protein folds in three-dimensional space) and quaternary structure (how folded chains assemble into a working complex), and it is this final shape that determines the rate of the reaction the enzyme performs. Much more about this, if you are interested, later.)

So information important to human survival is mapped onto the page by writing. Then, after Gutenberg, by printing. (Project Gutenberg is interesting in its quest to get as many books as possible onto the digital library of the Internet). And, in the late 20th century, by digitization into ones and zeros.

This reminds me a little of the Jorge Luis Borges’ tale where the king asks his cartographer to map the kingdom and he makes a small model. Next he asks him to make it larger. If you know Borges, you can figure out what happens next. Gradually the map (model) gets bigger until it encompasses the whole kingdom. What I mean is that the Internet is digitizing our lives, at some point entirely. We don’t need to travel to distant stars. We can just send all the information necessary in digital form and another civilization will reconstruct us. DNA and all.

Now, where is this information localized? The Growlery emphasized that Homer’s information space was across Greece, several hundred kilometers. I would certainly agree with this but would also add that the next big thing after the written word was the localization of information in libraries, particularly the Library at Alexandria. The destruction of that library must certainly rank right up there with the great tragedies of the ancient world, since this was the largest collection of texts (many on papyrus) ever assembled until that time. It may have reached as many as 500,000 scrolls. (Lara Croft has an encounter with the Library in one of the Tomb Raider adventures, I forget which.)

As an aside, Egypt has recently decided to open a library to commemorate the ancient library. Do we not find an intense irony here that the written word is now digitized so that, rather than a library, we could store this on a few CDs, or DVDs, or Blu-rays, or whatever? (The library I now use is eMedicine, Google and Wikipedia, although I have a home library of 2-3,000 books. I just like to sit amongst them. Cheap thrills.)
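How far would a disc actually go? A back-of-envelope sketch, where the scroll count is the high-end estimate above and the bytes-per-scroll figure is purely my assumption:

```python
# Could the Library of Alexandria fit on a few discs?
# Every figure here is a rough assumption for a back-of-envelope estimate.
SCROLLS = 500_000                   # high-end estimate of the collection
BYTES_PER_SCROLL = 100_000          # ~100 KB of plain text per scroll (a guess)
DVD = 4.7e9                         # single-layer DVD capacity, bytes
BLU_RAY = 25e9                      # single-layer Blu-ray capacity, bytes

total = SCROLLS * BYTES_PER_SCROLL  # 50 GB of plain text
print(f"{total/1e9:.0f} GB -> {total/DVD:.1f} DVDs or {total/BLU_RAY:.1f} Blu-rays")
```

On those (very rough) assumptions, the greatest library of the ancient world is about a dozen DVDs, or a couple of Blu-rays, before any compression. The irony stands.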

So, back to the thread. In spite of the exponential increase in information, much of it localized in books, then books in libraries, one has to ask how the single human progressed in this. Did all this information make life easier for the human or, paradoxically, more difficult? And, of course, how do we get and process this information?

Stay tuned.

Friday Crab Blogging

An Oldie, but Goldie.

Further post on information this weekend, if anyone is interested.

Tuesday, January 16, 2007

Commenting before moving forward:

The Growlery has two posts up (here and here) with some pithy comments on what we discussed below. As I had suspected, there are large numbers of people who are thinking along the same lines, sort of like a “swarm.” Here are a few extracted memes:

Concerning Information handling in the human:
Now, though, I did think about it. And, she (his dentist talking about his tongue) pointed out, all of this is just one small subroutine within the total body monitoring run by the central nervous system - which is, in itself, only one function of that system amongst many others.
This comment is interesting from a number of points, not the least that it suggests that human processing of information is a massively parallel affair. This coincides with current ideas of neuroanatomy, with the “executive” center being in the prefrontal cortex and areas such as Broca’s speech center (described a century and a half ago) and the motor cortex being ancillary to this directive. Of course we don't really understand how it works yet.

Concerning Intelligent Design:
Human beings, as I said in an earlier post, are a package evolved by evolution (or, if you are from the other shop, designed) to meet a very different world from the one within which robotics are developing.
I disagree with the Growlery on tossing a bone to the intelligent design crowd. While they pretend to be scientists, their agenda is clearly political and it pains me that we have to continually recognize their influence. Hopefully it is waning.

Concerning Lagrange points L1 and L2
I looked them up in Wikipedia. It took me less than a minute. Talk about information handling.

Concerning deposits of information in robots:
...to duplicate them onboard is unnecessary, unless security or the transmission delay is unacceptable for a particular purpose
I think I would like to say something about this in the future. Something about the fragility of information transfer (upon which all these visions of robotic swarms heavily depend). Have you ever worked in a Hospital?

Concerning the robots of the future:
It's not impossible that dentistry, like radiography, will one day be carried out by machines; but not by a humanoid robot.
This is very poignant. It's just like the American auto worker in the 50’s and 60’s looking forward to the future. Little did they know that their jobs would first be shipped overseas and then be taken over by automation. I don’t think that doctors (or lawyers, for that matter) have even considered what is going to happen to their professions when EMRs (electronic medical records) get fully implemented. In fact, they are clamouring for my office to implement a prototype information collection on all patients with respect to their mental health that will be shared by all health care professionals!!

Talk about Brave New World (a Shakespeare quote, by the way. Never to be replaced by a robot.)

Concerning the definition of “living”:
When will a machine, robot, automaton, be deemed “alive”?
I would say that this is purely a philosophical question akin to “how many angels can dance on the head of a pin.” It is interesting, if you read Wittgenstein, to see how many of his “questions” about language are equally irrelevant 100 years later. We should return to this after an appropriate science of neuro-informatics has been developed. Consider that working on these problems is just like Steven Weinberg and his theory of everything.

Next the Growlery refers us to five separate comments by Babbage and Lovelace. These appeared in 1998-1999 but are very relevant. The first of these is here:

What follows are just some quotes out of these commentaries that I found interesting:

Concerning “swarms” of automatons (I like this word in this context better than the word robot):
…It is not, he feels, a very great leap to imagine a linked ‘swarm’ of such vehicles, comprising, like ants, a greatly flexible, dispersed entity. The loss of a single automaton would not fatally injure the swarm, which could repair itself or even manufacture complete new vehicles. Perhaps such swarms are the future of planetary exploration and much else besides.
I think that this is a comment on the definition of life. What else is an organism but something that can “repair itself and manufacture copies of itself?” Of course that covers the Andromeda Strain, and I despise Michael Crichton.

Returning to this question again:
Human sentience, it has been argued, derives from manipulative capacity and a critical number of neural connections. How large must a swarm be, Babbage wonders, to provide that critical number? He finds the question fascinating; but with the prospect of automata free to decide to copy themselves or to kill, Mary Shelley’s warning about irresponsible scientific creation carries as much weight today as it did when Babbage, as a young man, first read Frankenstein.

The Pope does Computers:
Lovelace turned to the radio, where she heard that the Vatican reputedly plans to computerise "the search for God's fingerprints in the chaos of Creation".
What an enticing little tidbit. Except, as they point out, how do you decide what is a fingerprint of God?

Dance, angels, dance on that pinhead.

On the fundamental constants of the universe:
that the fundamental constants of the universe - seemingly tailored to our existence - may also have evolved Darwinistical
I am not a physicist, but this seems to be contradicted by the fact that, as time goes on, we are able to reconstruct the universe further and further backwards in time. I am almost certain that this presupposes that the constants don’t change (a constant wouldn't be a constant, would it). There is much speculation about these constants and, of course, the possibility of alternative universes comes up. It is my understanding that humans couldn't live in an alternative universe simply because only these constants allow life as we know it.

Finally a comment on robotic surgery:
Of the medical endorobotics experts Babbage spoke to, some foresaw automated surgery useful in battlefield or other hazardous environments. Others saw brighter prospects, a way out of resource shortages in a world of escalating clinical demand. But whatever the applications for daVinci and the daVinciettes, their increasing part in medicine seems to Babbage inevitable.
Did I just see my job go down the drain? This is a long time in the future, but it is probably a real possibility. What will humans do when we are all replaced by robots? Oh, you Luddites, if you only knew!

Thursday, January 11, 2007

Some Thoughts on Information

Some thoughts on the compartmentalization and dispersion of information in the biological world in response to questions posed by The Growlery.

This is such an interesting topic, and it ultimately reflects on the larger question of human knowledge, that it is certainly worth pursuing. Let me say from the outset that I am in no way an expert on these matters. I prefer to be viewed as a curious layman. And, I am sure, that people like Daniel Dennett have explored these issues far more deeply than I could even imagine (I have two of Dennett’s books and another on consciousness that I have yet to read). However, to just paraphrase what another writes, no matter how elevated, isn’t half as fun as thinking about it yourself. That’s why I like the Internet, it allows this kind of thing for a while (until the Korporate Takeover).

When was the first information necessary for an organism to survive? Well, it must have been pretty early in evolution that informational molecules were required. If we look on DNA as the current and ultimate repository of biological information, some prototype informational molecule must have been present from the word "go" to direct reproduction. If we had time travel, this is the first thing I would find out. What was the ur-molecule of life? Some have speculated that it was RNA, or an RNA precursor, but I think that unlikely. Even RNA, with its backbone of sugars, is much too complex. In any case, the necessity of information for survival dates from the beginning.

It goes without saying that this is what separates the animate from inanimate. There is no example that I know of where the inanimate is driven by information from another source. The crystallization of minerals into their complexity simply follows the second law of thermodynamics with the atoms settling into their respective potential wells in energy space. (This does mimic the organization of kinetic space in cellular metabolism but that is directed by DNA. More on this later.) In any case, the inanimate does not evolve.

Of course DNA and RNA are now the basis of all life, including pseudo-life such as viruses. One has to wonder at how and why these molecules came to hold the position they do. In addition to being a very slick means of both reproducing and directing cellular events, the latter by taking information stored in the genetic code to direct the formation of proteins, these molecules are also easy to mutate. Without mutation, there would be no evolution. But mutation changes the information. Most mutations destroy information in that they direct the synthesis of dysfunctional or afunctional proteins. But some mutations, the saints among men, actually produce “superior” or “new” proteins (and the use of the words “superior” and “new” here is very loose). They contain new information! (and “new” here is very tight). That is, the configuration of DNA base pairs is different, and this is new information. Of course, there is nothing new under the sun, and some information has had to be “destroyed.” This raises the question of whether information can ever be destroyed. Once information has been present, even if the physical repository (here DNA) is gone, is it possible to destroy it forever? If the DNA for unicorns was once present (just like the DNA for dodo birds and the 99% of other species that have ever existed, including dinosaurs) does that mean that it never disappears, even if the information is not in real space? A mathematician must answer this question.
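Since we are treating DNA as an information repository, it is worth noting how easy the raw capacity is to count: four equiprobable bases means log2(4) = 2 bits per base. A quick sketch (the genome size is the usual rough figure, assumed here):

```python
import math

# Raw information capacity of DNA: 4 equiprobable bases -> log2(4) = 2 bits each.
BITS_PER_BASE = math.log2(4)

human_genome_bp = 3.2e9      # rough haploid human genome size (assumption)
bits = human_genome_bp * BITS_PER_BASE
megabytes = bits / 8 / 1e6

print(f"{BITS_PER_BASE:.0f} bits/base -> human genome ~{megabytes:.0f} MB raw")
```

About 800 MB uncompressed: a whole human genome, counted naively, fits on a single CD. (This is raw capacity only; what a mutation does to the *meaning* of those bits is exactly the harder question above.)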

One should not dismiss lowly organisms such as bacteria as lacking sophisticated information handling (and needs). In addition to bacterial DNA (several species have had their total sequence determined) there are bacteriophages, viruses (virii?) that infect bacteria. In addition to being useful in the new science of gene therapy (getting DNA into human cells) they also carry information between bacteria in the process of infecting and destroying them (it's every bug for itself in the nether world.) This can be particularly nasty for humans because when a bacterium mutates its DNA to become resistant to an antibiotic, that new information can be relayed to all the other bacteria in the world in very short order by the intervention of bacteriophages. In this way, Staphylococcus aureus has become resistant to almost all conventional antibiotics, causing serious amounts of trouble treating infections with this common bug (the scourge of MRSA, methicillin-resistant Staph aureus).
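The speed of that relay is easy to underestimate. A toy calculation, assuming a round laboratory doubling time of 30 minutes for fast-growing bacteria and an arbitrary target population, just to show the shape of exponential spread:

```python
import math

# Toy illustration: once "new information" (a resistance gene) is copied
# exponentially, it spreads absurdly fast. The 30-minute doubling time is a
# round laboratory figure (an assumption), and the target population is arbitrary.
doubling_minutes = 30
target_population = 1e12     # an arbitrary large number of resistant carriers

doublings = math.log2(target_population)
hours = doublings * doubling_minutes / 60

print(f"1 -> {target_population:.0e} carriers in ~{doublings:.0f} doublings, ~{hours:.0f} hours")
```

Under those assumptions, a single resistant cell becomes a trillion in under a day. Real-world spread is slower and messier, but the exponential arithmetic is why MRSA is such a headache.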

Even inside the bacteria the information becomes more or less localized. However, it is not until prokaryotes evolve into eukaryotes, with a nucleus in the cell, that we have true localization. At that point, we need messenger molecules to take the information out of the nucleus (messenger RNA) to the site of protein synthesis (ribosomes). Almost like a cellular internet.
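The "cellular internet" analogy can be made concrete with a toy sketch of the two hops: transcription (DNA to mRNA) and translation (mRNA to protein). The four-codon table below uses real genetic code assignments, but is trimmed to just the codons this made-up sequence needs:

```python
# Toy sketch of the central dogma's two hops. The codon table is a real
# (but tiny) subset of the genetic code, covering only this example sequence.
CODON_TABLE = {
    "AUG": "Met", "UUU": "Phe", "GGC": "Gly", "UAA": "STOP",
}

def transcribe(dna: str) -> str:
    """Shortcut: read the coding strand and swap T for U."""
    return dna.replace("T", "U")

def translate(mrna: str) -> list:
    """Read the message three letters at a time until a STOP codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino_acid = CODON_TABLE[mrna[i:i + 3]]
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

mrna = transcribe("ATGTTTGGCTAA")
print(translate(mrna))   # ['Met', 'Phe', 'Gly']
```

Real translation involves all 64 codons, tRNAs and the ribosome machinery, of course; the point is only that it is a literal mapping, symbol by symbol, just like writing.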

In any case, while one is tempted to relegate “information” to higher organisms and even restrict it to humans, the reality is that all of life is based on information. However, as the Growlery points out, there is a hierarchy of information placement, though he restricts it to the human world:
… a tribal system of oral history and a Googlised internet are both points on the same continuum, albeit separated by an arc of accelerating complexity curve.
As organisms get more complex, information, information storage, and information access become more complex. At some point, information begins to be stored outside the organism. Now, to be sure, one can never separate an organism completely from its environment; everybody needs a Petri dish. We would have no viruses to taunt us if there had never been bacteria to harbor their growth. But we wouldn’t be able to eat steak either, since the intestinal flora help in digestion, including absorbing vitamins. What we mean here is that organisms extract information from the environment in order to exist, but I am not exactly sure how to say this. Is there a difference between my Googling “phage” on the Internet and a bacterium sucking up the phage du jour? This needs a lot more thought. What exactly do I mean by “extracting information from the environment” and how does this differ from just receiving sensory input? I know what I mean but I can’t express it. There is a difference.

Fast forward to mammals who walk upright. I have not yet read "The Singing Neanderthals: The Origins of Music, Language, Mind and Body" by Steven Mithen, though I did read the NYRB review. It is an intriguing set of hypotheses but, ultimately, an exercise in metahistory. We cannot know what really went on in the camps of the Neanderthals. However, it is a good bet that they developed verbal communication. Most important here is that communicating by sound may have resulted in the establishment of a repository of information that, while not physically outside the individual, may be described as cultural or collective knowledge. Now, you will immediately object that this is no different from the "instinctive" knowledge of bees in a hive or ants in a hill. Certainly those insects function with a certain amount of community information, and they communicate: ants with their pheromones and touch, bees with their visual dance.

But somehow, once the Neanderthals started to collect information as language, I think we entered a new era. You might disagree with me here. What we do know is that there were many thousands of years of oral tradition between the "Swinger" singers and the oral tradition of Homer. On the scale of true evolution, of course, it was minuscule (see the timeline of evolution that was circulating on the Web in the last few years to counter the Biblical time frame).

It is amazing, though, that Homer and the Neanderthals were at the same level in terms of information. Of course, aesthetically (a rather subjective thing) they are miles apart. It is interesting that the aesthetic appeal is due to complexity, since there is basically no difference in the level of information between apes singing around a fire and Greeks singing around a fire. This, of course, raises the question of how complex the information can get. What is the upper limit? Is there always an upper limit, and have humans reached it with our current brain?

So, next up: information after Homer. It seems the surface hasn't even been scratched.

Tuesday, January 09, 2007


Tom Tomorrow's cartoon (which is always pointed) took a swipe at doctors today in the process of pointing out the absurdity of health care in the U.S. It makes me sad that someone as aware as he is of political events seems confused about what is going on in medicine. Doctors have nothing to do with offering insurance; we only slave under it.

Dear Tom,

Doctors are not responsible for and do not support the current fiasco that is health insurance. In fact, insurance companies are our bitterest enemy. They consume untold hours of staff time and pay only 50-70% of what they are billed. When is the last time you went to a gas station and paid 50% of what was on the pump?

Dr. C.

P.S. The CEO of United Health Care made $38 million in 2002.

Further Insanity

A U.S. attack plane killed many people with barrages of gunfire in a remote Somali village occupied by Islamists thought to be hiding at least one al Qaeda suspect, a Somali government source said on Tuesday.(emphasis added)
This is just wrong, wrong, wrong. Yes, there might have been an al Qaeda suspect. But innocent people were killed.

This makes it murder.

Monday, January 08, 2007

A small agenda

I am a little overwhelmed by the response over at The Growlery to a post I had below. Let me see if I can summarize the action in this flick to date:

1. An observation was made about the finding of random statue fragments (I almost said statutory), with the suggestion that information technology, if it can piece together the Martian landscape, should be able to piece together the puzzle of a few dozen fragments. After all, a human can put together a 500-piece jigsaw puzzle pretty quickly, and computers can beat the world champions at chess. I had been meaning to read The Growlery's article on information processing and, indeed, he did reference this type of puzzle solving, but then went on to mention some much more interesting things. I will switch from the terminology "knowledge" to "information," though they may be the same thing.

2. These interesting things included the fact that many organisms, including in large part human beings, must carry their information around with them (we all come with 10,000 genes for smell receptors: 10,000 pre-programmed odors!! Gene knowledge versus learned knowledge; the classic Platonic conundrum), whereas robots may be largely free of such restrictions. While the immediate needs of recognition may have to be on board the robot, a large amount of a robot's information may be stored off site. For example, as the Growlery points out, your personal robot might store your culinary preferences on the moon (as long, of course, as there was "instant" access through, I presume, a rapid satellite bluetooth connection).

3. The first topic we need to explore, then, is this idea of information localization. There are a number of fascinating aspects to this, including Wikipedia, the fact that the Library of Congress could be put on a small number of CDs, the emerging paradigm in medicine of information handling (are radiologists obsolete?) and, finally, the concept of human knowledge itself being located off site, as in "cultural" or "tribal" knowledge. Pertaining to this, the Growlery says:
Survival through all those millennia of pretechnological existence made us what we are: a fixed package of onboard facilities with expansion taken care of by social groupings.
Please take note of the social groupings, i.e. tribes. Is there dispersed knowledge à la Carl Jung? Of course, humans now have Google, which might make tribes, in the sense of Diamond's progression of socialization (cf. Guns, Germs and Steel), obsolete as well, along with radiologists.

4. Once we come to some general conclusion about the localization of information (is not "knowledge" just "information" in context?), we need to address the actual procurement of knowledge by the human organism. For this we will have to rely on some concepts of neurophysiology, but we will also need to extend them to the molecular level. This will involve a discussion of the "threshold" hypothesis of human action. I can't wait.

5. With the tools developed in 4, then, and only then, can we address the concept of Free Will in a truly physiologic and molecular way. Even a casual survey of current human activity (particularly on a moral level) will show that if we conclude that humans do not have "Free Will" in any shape or form, then the entire society from top to bottom has to be rethought (including crime, including Saddam, etc., etc., etc.). A mighty big order, but even more interesting than 4.

Saturday, January 06, 2007

Barbara Boxer the Ecumenicist

I was really struck by this:
Boxer said legislation on climate change will be her No. 1 priority as committee chairwoman. She plans to conduct hearings early next year with scientists, environmentalists and religious leaders who want to address climate change. She also will ask business leaders to testify about their efforts to limit greenhouse gases.

Religious leaders? What? Are we to include the local Swami? Maybe Pat Robertson and Billy Graham (though his kids are arguing about where he's going to be buried).

What does a religious leader have to say about the environment, other than taking up our valuable public time arguing about intelligent design?


Monday, January 01, 2007

Bush and the Constitution

I just saw the video on Firedoglake of Olbermann talking about the Constitution with a Professor of Constitutional Law at GWU. Pretty sad. Pretty frightening. My reply.

The puzzles of Keros

Now here is a puzzle for any mathematician who reads this:
Statues offer clues to Greek isle's past
By NICHOLAS PAPHITIS, Associated Press Writer
Sun Dec 31, 4:19 AM ET

ATHENS, Greece - Unlike its larger, postcard-perfect neighbors in the Aegean Sea, Keros is a tiny rocky dump inhabited by a single goatherd. But the barren islet was of major importance to the mysterious Cycladic people, a sophisticated pre-Greek civilization with no written language that flourished 4,500 years ago and produced strikingly modern-looking artwork.
...Keros is a repository of art from the seafaring culture whose flat-faced marble statues inspired the work of 20th century masters Pablo Picasso and Henry Moore.
When they were unearthed, the white marble shards were jumbled close together like a pile of bleached bones, an elbow here, a leg there, occasionally a head. (emphasis added)
So, here's the challenge: is there a program that solves a puzzle like this? That is, one that takes a jumbled mess of fragments and reconstructs entities? I would assume there are puzzle-solving programs, just as I am sure the topologists are all over this like chicken pox (ah, something I do know something about).
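To make the idea concrete, here is a toy one-dimensional sketch of my own (not anyone's real reconstruction software): treat each shard as a string and greedily glue together the pair whose broken edges overlap best. Real programs would match 3-D fracture surfaces instead of letters, but the greedy principle is the same.

```python
def reassemble(fragments):
    """Greedily merge fragments by best edge overlap.

    Two fragments "fit" when the end of one overlaps the start of
    another; we repeatedly merge the best-fitting pair until nothing
    fits anymore. A crude stand-in for matching fracture surfaces.
    """
    pieces = list(fragments)
    while len(pieces) > 1:
        best = None  # (overlap_length, i, j)
        for i, a in enumerate(pieces):
            for j, b in enumerate(pieces):
                if i == j:
                    continue
                # longest suffix of a that is also a prefix of b
                for k in range(min(len(a), len(b)), 0, -1):
                    if a.endswith(b[:k]):
                        if best is None or k > best[0]:
                            best = (k, i, j)
                        break
        if best is None:
            break  # no edges match; leave the remaining pieces apart
        k, i, j = best
        merged = pieces[i] + pieces[j][k:]
        pieces = [p for n, p in enumerate(pieces) if n not in (i, j)]
        pieces.append(merged)
    return pieces

# A "statue" broken into overlapping shards, per the AP description:
shards = ["an elbow he", "w here, a leg", "leg there, a head"]
print(reassemble(shards))  # → ['an elbow here, a leg there, a head']
```

The catch, of course, is that real fragments have eroded edges and missing pieces, so the greedy match can fail or mislead, which is exactly why this is an interesting pattern-recognition problem rather than a solved one.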

Oops, now that I've written all the above, I see that Felix Grant has already talked about the problem in his article on pattern recognition.

And, while I'm at it, let me throw out this quote from Dr. Grant's article:
..... a robot need only perform onboard perception and cognition where speed of response requires it – roughly the equivalent of instinctive functions in a biological organism.
It is my opinion, and I plan to spend a lot more time on this, that all functions of a biological organism are "instinctive." Actually, I would put the word "reflexive" there.

Furthermore, I contend that this includes all of the so called "higher" functions of humans up to and including Free Will. Take that you intelligent designers.

(Oh, and by the way, pattern recognition will be the death knell of radiologists.)

(And, of course, read Dr. Grant's article, it is very interesting in separating out the physical locale of knowledge. More on this and medicine later.)