The uneasy relation between philosophy and biology
Recently, I spat the dummy on sci.bio.evolution and declared that I would never post there again. This is a bit unlike me, I think, as I have been on Usenet for over a decade, and on reflection I think I ought to make a public comment on why I took this step.
The debate was getting a bit personal, and I was accused of being "pig-headed", "needing to go back to secondary school" and so forth, but such things are part of the rough and tumble of Usenet debate, and had I not been feeling poorly and a little down, they would not have ordinarily affected me. Even so, this was not entirely the reason I jumped the sbe ship.
More serious was the way in which biologists (and some non-biologists, including computer programmers, who tend to think they invented all that is worthwhile in thinking) treat philosophical debate.
A large number of scientists think that because they know their science (which, indeed, they ought) they are entitled to make any philosophical claim they like in respect of it. Likewise, engineers (of which computer programmers are a subspecies, as Dilbert reminds us) tend to think that because words or ideas are used by their own profession in one way, they must mean the same thing in all places, professions and times. And that this should settle it.
I was trying to argue that DNA is not digital, in the purest mathematical sense. I appealed to the idea of a Turing machine - an idealised, physically unrealisable device that defines what it is for a problem to be computable at all. A careful comparison of a Turing machine and a physical computer like the Macintosh on which I type this blog shows that, at best, my Mac is just an approximation of a digital device. For my Mac can suffer from the slings and arrows of thermodynamics, while a Turing machine cannot.
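The contrast can be made concrete. Here is a minimal sketch of a Turing machine (the machine, its tape, and the example transition table are all my own illustrative inventions, not any standard formulation): the idealised device manipulates symbols perfectly, every time, forever. No physical computer behaves this way; a real machine only approximates it, and can in principle be knocked off course by heat, cosmic rays, and the rest of thermodynamics.

```python
# A minimal Turing machine sketch (illustrative only). The idealised
# device reads a symbol, writes a symbol, and moves the head, with
# perfect reliability - which is exactly what no physical device has.

def run_turing_machine(tape, transitions, state="start", blank="_"):
    """Run until the machine enters the 'halt' state; return the tape."""
    cells = dict(enumerate(tape))  # sparse stand-in for an infinite tape
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example machine: invert a binary string (0 <-> 1), halting at the
# first blank cell.
invert = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run_turing_machine("0110", invert))  # prints "1001"
```

The point of the sketch is what it leaves out: there is no term anywhere in it for error, noise, or decay. That omission is the idealisation, and it is precisely what separates the mathematical object from my Mac.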
Why does it matter? Well it is this: we tend, in the western tradition, to overgeneralise from how we represent things to the ways things are. This is the subject of many fallacies in logic, such as fallacies of composition, but the generic problem is one that goes by the name of the "microcosm-macrocosm", and which can be traced back to Plato. Robert Fludd, in the 17th century, presented analogies between human organs and the celestial realm, following Plato, but this is only the most egregious instance. The general problem is that of anthropomorphising nature - making it like us.
A school of thought known as Developmental Systems Theory, of which Paul Griffiths is the most prominent author, has arisen to challenge the idea that genes, in particular DNA molecules, are "information", a view that Dawkins and G. C. Williams have championed. On this account, DNA molecules are another kind of developmental resource. The "information" metaphor is all in our heads. For those interested, the book that started all this is Susan Oyama's Ontogeny of Information. Reifying information is a form of anthropomorphism.
This came home to me when a member of my department, Drew Berry, was trying to model in 3D and real-time the replication of the DNA molecule, examples of which can be found here, and which appears as the animations in the TV series DNA. Drew remarked to me that this was a gross simplification, as mismatches occurred when the "wrong" molecule was made instead of the templated one, and also that he had to leave out all methylation - the methyl groups that bind to the outer aspect of the helix and which often regulate the expression of the genes. Also, methylation is used by the "repair mechanisms" to determine which strand is to be used to make the "correction".
Now I began to wonder - if DNA is really information, that is to say, digital, why all this messy analog stuff? It hit me that DNA is information in the same sense that my Mac is a Turing machine - it approximates it, in the analog world. In short, the information is in our representations of genes as sequences of symbols "G", "T", "C", and "A". The molecules are not symbols - they do whatever the thermodynamics permit. Thinking of genes as information leads to error - the idea that, in order to evolve, we have to have replicators that are digital, as Dawkins argued in his book River out of Eden. I think this is unnecessary, and raises a serious problem of how replicators could arise - if they couldn't evolve because they are the sine qua non of evolution, then they had to arise by chance, and I think this is massively unrealistic.
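The distinction can be put in code. Below is a toy sketch (my own illustration, not a chemical model, and the mismatch rate is an arbitrary assumed figure): the "digital" view of replication is an exact function over the letters G, T, C, and A, while the molecular reality is better caricatured as that same function plus an error term that thermodynamics never lets reach zero. The information lives in the first function; the molecules only ever do the second.

```python
# Illustrative sketch only: symbolic vs. "physical" replication.
# The symbols G, T, C, A are our representation; the mismatch_rate
# is a made-up stand-in for thermodynamic noise, not a measured value.
import random

COMPLEMENT = {"G": "C", "C": "G", "A": "T", "T": "A"}

def replicate_symbolically(strand):
    """The digital idealisation: every base pairs with its partner."""
    return "".join(COMPLEMENT[base] for base in strand)

def replicate_physically(strand, mismatch_rate=1e-4, rng=random):
    """A toy analog version: occasionally the 'wrong' base is added."""
    copy = []
    for base in strand:
        if rng.random() < mismatch_rate:
            # A mismatch: some base other than the templated one.
            copy.append(rng.choice([b for b in "GTCA" if b != COMPLEMENT[base]]))
        else:
            copy.append(COMPLEMENT[base])
    return "".join(copy)

print(replicate_symbolically("GATTACA"))  # prints "CTAATGT"
```

Only the first function is information in the strict sense; the second is what the world actually supplies, and the first is our map of it.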
So this is a philosophical dispute. Underlying it are all kinds of type-token and symbol-sign-referent distinctions that any philosopher would immediately see are important (and indeed would have since the work of Peirce and Jevons in the nineteenth century). To fail to make such distinctions is to invite a world of confusion. And many biologists do just this.
When I was treated like a little child, unable to grasp what even computer programmers could intuit, I took offence - too quickly, to be sure, but this is a symptom of a larger problem: the arrogance of many (in particular, molecular) biologists. Scientists too often assume that before their kind arose, humanity was stupid. My own reading suggests that even when we were wrong, we were intelligent about it, at our best. I have a renewed respect for Aristotle, for instance, having tracked down how he applied his own notions of "genus" (genos) and "species" (eidos) to living things. The usual assumption among modern biologists is that he got it all wrong. He was much better than the myth suggests.
Thomas Kuhn often overstated his case in his work on science, but one thing he nailed exactly: "textbook history". This is the way modern scientists treat the past, as a mere precursor to the way they do things now. The way scientists treat philosophy is often equally parochial. Excuse me if I get annoyed from time to time...