For completeness, you should read those threads, but the summary is that when people tried to answer his question, e.g., by showing that point mutations increase the Shannon information of the genome, or pointing at the literature for gene duplication, Egnor said that wasn’t what he meant by “biologically meaningful information” and refused to provide a definition.
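(For readers unfamiliar with the Shannon-information point: the idea is that a point mutation can change the statistical composition of a sequence, which is straightforward to quantify. Here is a minimal sketch, not from the original threads, computing per-symbol Shannon entropy of a toy DNA string before and after a single substitution.)

```python
from collections import Counter
from math import log2

def shannon_entropy(seq):
    """Per-symbol Shannon entropy (in bits) of a sequence's base composition."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

original = "AAAAAAAAAA"   # a maximally repetitive stretch: entropy 0
mutated  = "AAAAACAAAA"   # one point mutation, A -> C

print(shannon_entropy(original))  # 0.0
print(shannon_entropy(mutated))   # about 0.47 bits/symbol -- the mutation increased it
```

This is only the compositional (zeroth-order) entropy, of course, but it illustrates the kind of concrete, measurable quantity the commenters were offering and that Egnor rejected as not "biologically meaningful."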
On the Mar. 26, 2007 episode of the ID the Future podcast, Casey Luskin interviewed Michael Egnor. They talked about these discussions. Egnor accused Darwinists of being angry and implied that they were unsure of the soundness of their own theory (start listening at 12:42, if you care).
Then (around 14:16), Egnor said
I, for example, if a Darwinist approaches me, and asks me politely about Intelligent design, I’m delighted to talk about it!
I took this as an invitation to ask him to clarify his remarks.
On Apr. 13, I sent him this message:
Dear Dr. Egnor,
A while back, I read about your attempts to get Darwinists to tell you how much biologically meaningful information a Darwinian process can generate.
Unfortunately, I’m not sure what you mean by biologically meaningful information. Could you please clarify what you mean by this?
He replied (all email reproduced with permission):
Thanks for the note. I asked Darwinists to define biological information, because Darwin’s theory hinges on it. Darwin asserted that all natural functional biological complexity (information) arose by non-teleological variation and natural selection. ID theory asserts that some natural functional biological complexity (information) arose by teleological variation and natural selection. By ‘teleological’ I mean a process that is most reasonably understood as the result of intelligent agency, analogous to human intelligent agency, with which we have ample experience.
These assertions are the whole issue in the ID/Darwin debate.
I think the best definition is Dembski’s CSI, but there remains a lot to understand. What appalled me is that Darwinists don’t even know how to measure the property on which their entire theory turns.
I can’t help them prove their theory. That’s their job. What kind of scientist asserts that his theory is a fact, and when you ask him for the data on which his theory turns, he demands that you tell him how to prove it?
Darwinism is a scandal.
Unfortunately, that didn’t answer the question, so the next day I wrote back:
Thank you for your answer, but I’m afraid I still don’t understand. For one thing, the people at Time and Pharyngula defending evolutionary theory gave examples such as gene duplication, but you said that wasn’t what you were asking for.
This leaves the question of what you are asking for. You say that it’s close to Dembski’s CSI, but unfortunately I’ve been unable to find out how to calculate CSI.
Perhaps an easier question is, if a process did increase (or decrease) biological information in the way that you ask, how would we know? What would we have to measure?
On Apr. 22, he replied:
No one knows how to measure biological information in a meaningful way. The current ways of measuring information (Shannon, KC, etc) are relevant to sending signals, and are not of much help in biology.
Gene duplication is not a source of significant new information. It obviously changes the way things work in the cell, to some extent, but it can only copy what’s there, and we’re asking how it got there to begin with.
Even though we can’t measure it (and serious investigators like Dembsky are trying to figure this out), we know biological information when we see it. The genetic code, molecular machines, seamless integration of physiology are all obviously the kind of biological information that we are trying to understand. The only source of such information (or functional complexity or whatever) that we know of in human experience is intelligent design. There are no ‘natural’ codes, aside from biology, which is the topic at issue.
Darwinists have a responsibility to show that undesigned mechanisms can produce sufficient biological information to account for living things. If they don’t even know how to measure it, how can they assert that random variation and natural selection can account for it, and why is the design inference ruled out?
(Again, emphasis added.)
I wrote back:
If I understand correctly, evolutionary biologists do not recognize biological information as a necessary, or even useful, concept. You, on the other hand, intuitively recognize biological information, but cannot quantify this information the way that Claude Shannon quantified the nebulous notion of “information”.
Since you are claiming that “How much biological information can be generated?” is a meaningful and important question, isn’t it then up to you to define what you mean? It looks as though you’re asking evolutionary biologists to formally quantify your intuition, which hardly seems fair.
Have I misunderstood something?
He replied:
The origin of functional biological complexity (‘biological information’, or whatever) is obviously of central importance to biology and the Darwin/ID debate. You can’t make the problem go away by pretending that it’s not a problem. We ID folks have a straight forward explanation: like all complex functional ‘machine-like’ systems that we encounter, biological systems are best explained (at least in part) as having arisen from intelligent agency. This raises profound philosophical issues, which is the reason that Darwinists are avoiding it, even to the point of denying that it exists.
If you don’t think that there is anything that could be meaningfully be called ‘biological information’ in living things, then there’s not much that we can talk about. I have no patience for sophistry.
which I took to be the end of that conversation.
So there you have it, folks: Michael Egnor can’t define “biologically meaningful information”, no one knows how to measure it, so obviously it’s up to “Darwinists” to do the hard work of formalizing his intuitive notions. But he’s delighted to talk about it if asked politely!