Friday, August 31, 2007
Thursday, August 30, 2007
Attacking Iran - The Ultimate Folly
There is ample evidence that George W. Bush intends to attack Iran. The New Yorker article from April 2006 by Seymour Hersh, The Iran Plans, was a wake-up call for many of us. (We should have taken Mr. Hersh very seriously, since he has frequently been right about these things. He was particularly prescient about the current disaster in Iraq.) More recent studies and articles have suggested that the Iran attack will come before the end of the Bush presidency, maybe even as soon as this fall.
Considering a war with Iran:
A discussion paper on WMD in the Middle East
by Dr Dan Plesch and Martin Butcher
September 2007
(Dr Plesch is Director of the School of Oriental and African Studies’ Centre for International Studies and Diplomacy (London). Martin Butcher is an international consultant on security politics.)
There seems to be little concern in our mainstream media (MSM) about this horror. While historical precedent is overwhelming in its judgment that superpowers court disaster through their hubris (e.g., Barbara Tuchman: The Guns of August, and The March of Folly: From Troy to Vietnam), almost every politician of note, including the front-running Democratic presidential candidates, has been rattling sabers along with the most obnoxious neocons. Who can forget McCain's refrain, "Bomb, Bomb, Bomb; Bomb, Bomb Iran," to the tune of the Beach Boys' "Barbara Ann"? (I don't know about you, but in my pantheon, this is a sacrilege.)
Very simply, an attack on Iran would result in the deaths of thousands, if not millions, of innocent men, women and children.
Be very sure about it: such an attack on Iran would alter the life of everyone on this planet.
The worst scenario of all is that the Air Force, given the mission of destroying Iran's hardened nuclear facilities, would have to resort to tactical nuclear weapons. And if our Air Force didn't do so, Israel would. In the Middle East chaos that would follow the first attack, there is no doubt in my mind that Israel, with its two hundred nuclear weapons, would feel "threatened." There is also no doubt in my mind what would then happen to Israel's neighbors, nuclear or not.
Is Iran a credible nuclear threat? Let me just suggest that you review the history of the runup to the invasion of Iraq if you doubt that what is going on now is exactly the same thing.
And then there's:
IAEA: Iranian cooperation significant
By GEORGE JAHN, Associated Press Writer
VIENNA, Austria - The U.N. nuclear agency said Thursday that Iran was producing less nuclear fuel than expected and praised Tehran for "a significant step forward" in explaining past atomic actions that have raised suspicions.
The assessment is expected to make it more difficult for the United States to rally support for a new round of sanctions against Tehran.
Wednesday, August 29, 2007
Data, data, everywhere, enough to drown a fish…..
The Growlery has linked to his article in Scientific Computing, Making child's play of power tools.
I found this article both interesting and stimulating. It is a good summary of data-crunching software, with particular emphasis on how the big-league programs have developed modules aimed at students. It raises a number of issues, and it also brings back some memories to this feeble old mind.
Firstly, it got me thinking about education, particularly scientific education, how it has changed over the years, and its extension into scientific research. Secondly, it got me thinking, once again, about kinetic data and computing. I'll simply reminisce about the first topic, but the second is nearer and dearer to my heart, since it reflects back on our discussion of information transfer.
Reminiscences:
When I was in high school, Sputnik had just gone up. If you had the least aptitude for science or math, you were lavishly encouraged. I will admit that we all thought this was the way the world was heading and found it inconceivable that we would ever live in a country whose President was scared of science and thought his intuition was better. Worse yet, a President who states that he talks to God. (If that doesn't make an atheist out of you, nothing will.) Science was pretty near the most exciting thing out there. In the mid-'50s we got a chance to see a computer first hand and to try our hand at programming. It was the IBM 650, the first real workhorse for business computing.
IBM introduced a smaller, more affordable computer in 1954 that proved very popular. The IBM 650 weighed over 900 kg, the attached power supply weighed around 1,350 kg, and both were housed in separate cabinets of roughly 1.5 by 0.9 by 1.8 meters. It cost $500,000, or could be leased for $3,500 a month. Its drum memory originally held only 2,000 ten-digit words and required arcane programming for efficient computing. This "memory" consisted, as the name says, of 2,000 addresses on a fast-rotating drum. Data was read in from punch cards (punch cards continued as the preferred mode of input up into the '70s), and its output went to a teletype-like printer. To program this computer, one had to place instructions on the drum in such a way that you took advantage of its rotation. That is, by the time the computer had read one instruction and executed it, the drum had rotated by, I think, three positions, so that is where the next instruction had to go. The only things the computer could do were add, subtract, multiply, and perform logical tests like AND and OR. Our final task in the introductory course was to program the 650 to take a square root. No hand-held calculators here, just a $3 million (in 2007 dollars) machine! I think we used Newton's Method.
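For anyone who has forgotten how that exercise goes, here is a minimal sketch of a square root by Newton's Method, in Python rather than 650 machine code. The starting guess and stopping tolerance are my own choices, not a reconstruction of the program we actually wrote.

```python
# Newton's Method for sqrt(x): repeatedly replace a guess g with the average of
# g and x/g, which is the Newton update for the equation g*g - x = 0.

def newton_sqrt(x, tolerance=1e-10):
    if x < 0:
        raise ValueError("square root of a negative number")
    if x == 0:
        return 0.0
    guess = x if x >= 1 else 1.0   # any positive starting guess will do
    while abs(guess * guess - x) > tolerance:
        guess = (guess + x / guess) / 2.0
    return guess

print(newton_sqrt(2.0))    # ~1.4142135623730951
print(newton_sqrt(650.0))  # ~25.495097567963924
```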
Moving onward to undergraduate school. The university I attended had a lot of pre-med students (I wasn't one then). They all had to take Analytical Chemistry with us Chem majors. While there were some electronic instruments available, this was mostly a course of titrating one chemical against another. The data was read off a burette and worked up using logarithms out of the Chemical Rubber Company handbook. Thus, all computations were done by hand, without benefit even of an adding machine. (Slide rules, which we all carried in belt holsters, were, of course, not precise enough for this work.) You were graded on accuracy. What you gained from this exercise, which should have been appreciated by the pre-meds (it wasn't), was precision. Also, in the context of the Growlery's article, one really, really, really got close to the data in a way that is impossible today. In this course we had to personally collect each little tidbit of information by reading the meniscus on a burette. Compare that to what surely must be the current situation: students inundated with thousands of pieces of data collected by a machine, with a certain remoteness from the primary data source inflicted on them. I think you can argue that this trivializes data in a way, but I'm open to other views.
Further onward to graduate school. Computers were just in the offing. We had a big IBM 360 mainframe, but it was hard to get time on it.
Graduate chemistry was an experience with an effusion of data. However, it was still collected in a very primal way. The most common was analogue data (peaks on a chromatogram, infrared spectra, NMR, etc.), which we digitized by hand. That is, we measured distances and, believe it or not, cut out paper peaks and weighed them on a triple-beam balance. At this time, during the late sixties, crystallographers went from determining their structures by reading diffraction film and measuring distances by hand, doing the calculations on a big adding machine, to automated machines that collected data with a Geiger counter and generated punch cards. The cards still had to be carried to the "big" computer (e.g., the 360), where the data was processed. But, in the end, we even got a drawn structure from the CalComp plotter. This was graphics in its infancy.
Was the exponential increase in the quantity of data worth removing the researcher from its first-hand collection? I am sure that it was. None of the current technical advances would have been possible without it (e.g., sequencing the human genome). However, there was something earthy in all those notebooks of numbers put down by hand that must be missing in this age of printouts and automatically generated spreadsheets. The question becomes: has this affected the chances of a researcher having an insight? Can such data overwhelm and drown the investigation? This might be true particularly in the field of neurobiology, where one is dealing with 100 billion neurons.
Kinetics:
Much later, I encountered a problem with data that baffled me. It wasn't a very profound problem, but it occupied me for years while my main job was practicing medicine. One of the principal anticancer drugs used in oncology over the past 30 years has been cisplatinum. The story of how an inorganic complex got to where it is in the treatment of cancer is fascinating, but I'll leave it for a future thread. In any case, I had been puzzled by the fact that everyone believed cisplatinum achieved its anticancer action by reacting with DNA. This just didn't make chemical sense to me (it still doesn't), since platinum complexes are much more reactive with sulfur compounds than with the predominantly nitrogen-containing nucleotides of DNA. In any case, we did some experiments with cisplatinum and glutathione (one of the principal sulfhydryl-containing biomolecules). To make a long story short, it is a two-stage reaction, with the complex reacting with glutathione and that product then reacting further. We never identified the products; we were mainly interested in the rate of reaction. Sounds trivial, I know. Mathematically, this two-stage reaction is represented by simultaneous differential equations. Being a good Scout, I tried to solve them. It turns out that the kinetics of competing reactions is difficult to treat in an exact manner.
We finally solved the problem by modeling it in Excel (actually Lotus 1-2-3, it was that long ago). We (re)discovered that any complex kinetic problem can be divided into smaller and smaller time segments (Newton strikes again). If the time segments are small enough, one can run each reaction sequentially and use the products of the reactions as the starting point for the next tiny time segment. To determine the rate constants, you had to manually compare the reaction curves. I think this is similar to what is called Euler's method, but, remember, I am not a mathematician. Eventually, I discovered that this technique had already been computerized for me, in a nice graphical manner, by the people at ScientificSoftware with their program Stella. I didn't see it in the list of programs discussed by the Growlery, probably because it is from a different field.
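For the curious, here is a minimal sketch of that time-slicing idea in Python (the spreadsheet did the same thing, one row per time slice). The two-stage scheme is the one described above — cisplatinum plus glutathione giving an intermediate that then reacts further — but the rate constants and starting concentrations are illustrative stand-ins, not the values from our experiments.

```python
# Small-time-step (Euler-style) simulation of a two-stage reaction:
#   cisplatinum + glutathione --k1--> intermediate --k2--> final product
# Each pass through the loop advances the clock by a tiny dt and lets each
# reaction consume or produce a correspondingly tiny amount of material.
# All numbers below are made up for illustration.

def simulate(k1=0.5, k2=0.1, cis0=1.0, gsh0=2.0, dt=0.001, t_end=20.0):
    cis, gsh, inter, prod = cis0, gsh0, 0.0, 0.0
    t, history = 0.0, []
    while t < t_end:
        r1 = k1 * cis * gsh * dt   # amount converted in stage 1 during this slice
        r2 = k2 * inter * dt       # amount converted in stage 2 during this slice
        cis -= r1
        gsh -= r1
        inter += r1 - r2
        prod += r2
        t += dt
        history.append((t, cis, inter, prod))
    return history

for t, cis, inter, prod in simulate()[::2000]:
    print(f"t={t:5.1f}  cisplatinum={cis:.3f}  intermediate={inter:.3f}  product={prod:.3f}")
```

Fitting would then amount to adjusting k1 and k2 until the simulated curves lie on top of the measured ones, which is exactly the manual comparison described above.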
One of the first things that ScientificSoftware points out when introducing students to Stella is that the human brain has a very hard time predicting the future when there is more than one input (i.e., competing reactions). The example they use is wolves and deer in the forest above the Grand Canyon (the Kaibab Plateau). Predicting how the deer population changes depending on the number of wolves and the birth rates of wolves and deer is hard enough. Throw in a rough winter, human hunters, etc., and it becomes impossible. Stella, on the other hand, handles the problem nicely. It is an excellent kinetic "what if" program. I guess this is the field of system dynamics. (I am also pretty sure that this is the way computer games are programmed, so my tiny effort has been lost in an ever-expanding supernova of Tomb Raiders and The Sims.)
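To give a flavor of that exercise, here is a toy predator-prey loop in the same small-time-step style. The birth, death, and predation rates are invented for illustration; they are not calibrated to the actual Kaibab history, and a real Stella model would add the winters, hunters, and food supply.

```python
# Toy deer/wolf "what if" model: deer multiply and get eaten; wolves starve
# without deer and multiply when deer are plentiful. Stepped forward in small
# time slices, one "year" = 100 slices. All rates are invented.

def step(deer, wolves, dt=0.01,
         deer_birth=0.8, predation=0.02,
         wolf_death=0.6, wolf_gain=0.01):
    d_deer = (deer_birth * deer - predation * deer * wolves) * dt
    d_wolves = (wolf_gain * deer * wolves - wolf_death * wolves) * dt
    return deer + d_deer, wolves + d_wolves

deer, wolves = 100.0, 10.0
for year in range(1, 31):
    for _ in range(100):
        deer, wolves = step(deer, wolves)
    print(f"year {year:2d}: deer={deer:7.1f}  wolves={wolves:6.1f}")
```

Even this stripped-down version cycles through booms and busts that are hard to anticipate in one's head.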
Kinetic modeling definitely bears on discussions we have had before. One of the defining characteristics of the human brain, as opposed to much smaller mammalian brains, is its ability to construct future scenarios. While we have just mentioned the limits of such projections when there is more than one kinetic input, in practice, in the everyday world, humans make predictions on a minute-by-minute basis. Driving a car in busy traffic requires an awful lot of intuitive computing of trajectories.
However, as we have previously described, in the end all mental processes are reducible to basic electrochemical reactions. Millions of them, to be sure, but still there. This set of reactors (computers, if you will) is presented with millions of pieces of sequential data from, for instance, the eye. The processing of this data must involve kinetics. For instance, I have just been reading about the period gene, a highly conserved gene in the group responsible for the internal clock of everything from the fruit fly to the human. This clock probably operates through an oscillating concentration of a biomolecule, actually a protein, with high levels shutting down its own production and low levels allowing it to resume (via transcription of the per gene and generation of its RNA).
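A cartoon of that kind of loop can be written as another small time-stepping model. The equations below are a generic delayed negative-feedback loop, not the real per circuit, and every parameter is invented for illustration; the point is only that when a protein represses its own transcription, and there is a lag between making the message and accumulating the protein, the concentrations can overshoot and fall back in a regular cycle.

```python
# Generic delayed negative-feedback "clock": protein made from its mRNA feeds
# back, after a delay, to repress transcription of that mRNA. With a long
# enough delay and steep enough repression, such a loop can cycle on its own.
# All parameters are invented for illustration.

from collections import deque

def simulate_clock(t_end=200.0, dt=0.01, delay=6.0,
                   a=10.0, K=1.0, n=4, b=0.5, c=1.0, d=0.5):
    lag_steps = int(delay / dt)
    past_protein = deque([0.0] * lag_steps, maxlen=lag_steps)  # protein history
    mrna, protein, t, trace = 0.0, 0.0, 0.0, []
    while t < t_end:
        repressor = past_protein[0]                    # protein level "delay" ago
        transcription = a / (1.0 + (repressor / K) ** n)
        mrna += (transcription - b * mrna) * dt        # synthesis minus decay
        protein += (c * mrna - d * protein) * dt       # translation minus decay
        past_protein.append(protein)
        t += dt
        trace.append((t, mrna, protein))
    return trace

for t, mrna, protein in simulate_clock()[::2000]:
    print(f"t={t:6.1f}  mRNA={mrna:6.2f}  protein={protein:6.2f}")
```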
Finally, it occurred to me when reading this paper that many of the examples involved static data, and that it is very hard to perform statistics on moving data. Many of us are familiar with Kaplan-Meier plots of survival data in oncology. These are used to compare survival for patients on various treatment regimens, with or without surgery and with or without radiation therapy. While diverging survival curves are frequently pounced upon as showing the superiority of a treatment regimen, such plots must be taken with a grain of salt because of the increasing statistical uncertainty of the data as it matures. One of the advantages of static data (I assume that most of the data referred to in the Growlery's paper is static) is that very sophisticated statistics can be done on it.
For moving data, i.e., kinetic data, however, all sorts of new problems arise. Taking all the data at a single time point, one can apply well-known statistical analyses. But when the data moves on to a new time point, one cannot treat the values there as independent, since they are constrained by where the data stood at the previous point in time. I have never understood how statistical analysis takes care of this, other than by connecting the dots one to another. Maybe these new programs can do this.
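A tiny, artificial illustration of that non-independence: in the synthetic series below, each value is built from the one before it, so neighboring points are strongly correlated and carry much less independent information than their sheer number suggests, even though the step-to-step changes are essentially uncorrelated.

```python
# Serial dependence in "moving" data: a random walk, where each point is the
# previous point plus noise. The raw values are highly correlated with their
# predecessors; the increments (the changes between points) are not.

import random

def lag1_correlation(xs):
    a, b = xs[:-1], xs[1:]
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    return cov / (var_a * var_b) ** 0.5

random.seed(1)
walk = [0.0]
for _ in range(999):
    walk.append(walk[-1] + random.gauss(0, 1))    # each value depends on the last
steps = [b - a for a, b in zip(walk, walk[1:])]   # the step-to-step changes

print("lag-1 correlation of the values:", round(lag1_correlation(walk), 3))
print("lag-1 correlation of the steps: ", round(lag1_correlation(steps), 3))
```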
Friday, August 24, 2007
Friday Crab Blogging
I asked this young Hispanic boy what his picture was about. His answer took me aback. He said, "New York." I said, "What do you mean, New York?" He said, "You know, the World Trade Center...."
You could have knocked me over with a feather. Children have an amazing ability to see the unseeable.
This is one of my favorite crabs. It distills the essence of crab, the soul of crab, whatever...
Friday, August 17, 2007
Thursday, August 16, 2007
Coming Attractions
I hope to blog on the following issues in the near future:
1. The Growlery's excellent article in Scientific Computing: Making child's play of power tools (I first encountered Mathematica in 1988)
2. The interesting exchange between Mr. Putnam and The Growlery over racism.
3. Some thoughts on Pediatric Oncology in Iraq (gleaned from an interview with Maryam, an Iraqi blogger on Gorilla's Guides, at FireDogLake) vs. the current treatment of Acute Lymphocytic Leukemia (ALL) in the U.S.
4. The ramifications of a world without Free Will. Why we never even think about this.
What's $30 Billion among "friends"?
Christmas 1914
This is a Christmas card from Germany, 1914. Although only four months old, World War I was already turning into a bloody mess. On the Western Front, shown here, the trenches were full of men, water and, most of all, mud. It was indescribably awful.
Just think of the mindset that would create a Christmas card with the theme of "Peace" over the horror of the trenches. Then think of the same mindset that claims we are making "progress" in Iraq in the 120-degree heat.
Still, on Christmas Day, 1914, soldiers in both armies laid down their arms and walked out into no man's land! The next day they got back to the business of slaughter.
I was brought to think about this strange episode in Man's warfare by this post of Mr. Putnam's, which references in full another post on Spiritual activism, political revolution & New Age fuzzy thinking. Here is a quote:
In Bush's War on Terror, threatening to kill anyone who threatens us and torturing the people we capture have become standard operating procedures. The neocon solution to terrorism appears to boil down to one simplicity: Kill all the bad guys.
And:
To believe that Iraqis and other Muslims are a different breed of human, that they don't hate and fear and love like we do, is simply crazy. But that seems to be the neocon/Bush belief.
Could it be that we don't know the first thing about human behavior? That humans, including Americans, operate on a much more fundamental level than we have been led to believe?
To say we learn nothing from history is a commonplace, but we really don't. At the risk of being iconoclastic, and given the conclusions drawn from the information thread, I am not optimistic. I really do not think that humans, particularly Americans, can control their (another overused term) destiny.
I guess it makes us feel warm and fuzzy that we can.
Friday, August 10, 2007
Friday Crab Blogging
Monday, August 06, 2007
The Bush Legacy
The sadness:
Suicide truck bomb kills 28 in Iraq
By HAMID AHMED, Associated Press Writer
BAGHDAD - A suicide bomber slammed his truck into a densely populated residential area in the northern Iraqi city of Tal Afar on Monday, killing at least 28 people, including 19 children, local authorities said. (emphasis added)
Friday, August 03, 2007