Having hopefully dealt with ImagingGeek's dismissal of QM, I will outline the problem modern QM is throwing up.
Usually one expects modern QM to resolve, via the correspondence principle, back to classical physics or at least old QM physics.
When ImagingGeek started this thread I noticed, strangely, that the issue of life and entropy does not, and it appears I am not the first to notice, as you will find in the calculation discussion.
The problem was originally set out by Erwin Schrödinger in his book "What is Life?" and it's worth a quick read for background (http://en.wikipedia.org/wiki/What_Is_Life).
The bottom line of the discussion was Schrödinger's "paradox", which, simply stated, says life does not defy the second law of thermodynamics because an organism creates disorder outside itself (chemicals, heat, etc.) that more than makes up for the order of the organism.
ImagingGeek is basically arguing that same argument via Gibbs free energy and its equations, and that this holds together in classical physics and old QM. He then goes beyond that and tries to extinguish QM just because we reach the macro world, which is crazy, and I have hopefully dealt with that issue.
Let me state the argument from a modern QM perspective:
Specify a precise physical system and then specify the exact (mixed) state whose probability distribution is exp(-βE(p,q)) classically, or whose density matrix is proportional to exp(-βH) under QM. These formulae only work for systems with many degrees of freedom that interact with each other. Such systems are either in equilibrium or not; if they are, the formulae are applicable and the equilibrium state at given conditions is essentially unique.
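To make the exp(-βH) formula concrete, here is a minimal numpy sketch (my own toy two-level system, nothing from the thread) that builds the equilibrium Gibbs state; the energies and inverse temperature are arbitrary illustrative choices.

```python
import numpy as np

# Toy two-level Hamiltonian, diagonal in the energy basis (illustrative values).
energies = np.array([0.0, 1.0])
beta = 2.0  # inverse temperature 1/kT, in units where k = 1

# Gibbs weights: p_i proportional to exp(-beta * E_i).
weights = np.exp(-beta * energies)
Z = weights.sum()          # partition function Z = sum_i exp(-beta * E_i)
p = weights / Z            # occupation probabilities
rho = np.diag(p)           # density matrix rho = exp(-beta * H) / Z

assert np.isclose(np.trace(rho), 1.0)  # a valid, unique equilibrium state
print(p)                               # the lower level is more occupied
```

The point the sketch makes is the uniqueness claim above: once β and H are fixed, ρ is completely determined.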
I thought I was going to have to write a rather long mathematical explanation and was discussing it with colleagues, when a friend of a friend suggested a link doing exactly what I was about to do: http://johncarlosbaez.wordpress.com/2012/06/07/information-geometry-part-11/
I not only agree with the mathematics but also with the conclusions:
But the basic idea is compelling: an evolutionarily stable state is like a situation where our replicators ‘know all there is to know’ about the environment and each other. In any other state, the population has ‘something left to learn’—and the amount left to learn is the relative information we’ve been talking about! But as time goes on, the information still left to learn decreases!
Note: in the real world, nature has never found an evolutionarily stable state… except sometimes approximately, on sufficiently short time scales, in sufficiently small regions. So we are still talking about an idealization of reality! But that’s okay, as long as we know it.
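The quoted result can be sketched numerically. Below is my own toy version using the replicator equation with constant fitnesses (Baez treats the general frequency-dependent case); the fitness values and starting population are arbitrary. The relative information of the stable state from the current population, the "amount left to learn", decreases at every step.

```python
import numpy as np

# Replicator dynamics with constant fitnesses (a simplification of the
# frequency-dependent setup in the Baez post; all numbers are my own).
f = np.array([1.0, 2.0, 3.0])   # fitness of each replicator type
p = np.array([0.5, 0.3, 0.2])   # initial population fractions
q = np.array([0.0, 0.0, 1.0])   # stable state: the fittest type only

def relative_information(q, p):
    """Kullback-Leibler divergence of q from p: the 'amount left to learn'."""
    mask = q > 0
    return float(np.sum(q[mask] * np.log(q[mask] / p[mask])))

dt = 0.01
history = []
for _ in range(1000):
    history.append(relative_information(q, p))
    mean_fitness = p @ f
    p = p + dt * p * (f - mean_fitness)   # discretised replicator equation
    p = p / p.sum()                        # guard against float drift

# The information still left to learn decreases monotonically toward 0.
assert all(a >= b for a, b in zip(history, history[1:]))
```

This is only a finite-difference caricature, but it shows the monotone decrease the quote is describing.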
The first quoted paragraph sets up a typical QM statement and is somewhat profound if you think about it.
- Organisms will always get more and more complex, or else they run out of "things to learn" and come into quantum equilibrium.
- The equations assume that organisms never forget information. I am not a biologist and I am not sure if that is true; it occurs to me that a gene present thousands of years ago may have been lost, and you would need to adjust the calculations slightly if so.
- It also sets up the slightly touchy QM question: does this imply that organisms know "all the things to learn", or is it simply a description of some slightly deeper truth?
I would also like to say "things to learn" is a better survival description than what I said, and here is my exact quote:
This was what I was getting at a complex organism has more things to attack and normal science entropy logic says it should be weaker BUT IT ISN'T that means something is important going on here .. so you missed my point.
To which ImagingGeek answered
There is no other answer to this other than "you are wrong". The degree of selection we see is a direct measure of the degree of selective forces experienced by the organism.
and he goes on to say
As a "rule" (because there are exceptions), bacteria are much more hardy than us, and can survive much wider swings in environmental conditions. And yes, they have compensatory mechanisms just like ours that allow them to do that.
There is perhaps some confusion over what the term complexity means, and I certainly understand what ImagingGeek means when he states bacteria are hardier than us under environmental conditions, but from a QM perspective they have a lot left to learn compared to us. I stand by my original comment; it is important, and the QM mathematics backs it up.
I am working with a few peeps on why Gibbs free energy falls apart under modern QM. I suspect this was always sort of known because of references to the Gibbs paradox in classical physics (http://en.wikipedia.org/wiki/Gibbs_paradox), and I note there is an argument going on among Wikipedia contributors about it (http://en.wikipedia.org/wiki/Talk%3AGibbs_paradox); they have called for physicists.
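To illustrate what the Gibbs paradox is about, here is a small self-contained sketch (my own numbers and simplifications): the entropy of "mixing" two identical gas samples comes out positive unless you divide the counting by N! for indistinguishable particles, in which case it vanishes as it should.

```python
import math

# Toy illustration of the Gibbs paradox. Units where k = 1; the momentum
# part of the entropy is the same on both sides and cancels in the
# difference, so only the configurational part is counted here.
N, V = 1000, 1.0  # particles and volume on each side of the partition

def config_entropy(n, v, corrected):
    """S = ln(v**n), minus ln(n!) if the indistinguishability correction
    is applied. Stirling's approximation ln n! ~ n ln n - n is used."""
    s = n * math.log(v)
    if corrected:
        s -= n * math.log(n) - n
    return s

def mixing_entropy(corrected):
    before = 2 * config_entropy(N, V, corrected)     # two separated samples
    after = config_entropy(2 * N, 2 * V, corrected)  # partition removed
    return after - before

print(mixing_entropy(corrected=False))  # classical result: 2N ln 2 > 0
print(mixing_entropy(corrected=True))   # corrected result: essentially 0
```

Removing a partition between two samples of the *same* gas should change nothing, and the N! correction is what makes the books balance.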
I should also say Lubos Motl has done a great article on mixed states and entropy, where he was beating up Erik Verlinde over basically the same issue: http://motls.blogspot.com.au/2010/02/entropy-information-and-mixed-states.html
The whole notion of entropy was designed, and is still critically useful, for understanding of the irreversibility in the world because the increasing character of the entropy is its basic property.
I should say, topically, this article appeared today on phys.org; it is sort of interesting, and while not directly related to the discussion it is doing the same sort of thing, using maths and QM to look at the problem of life: http://phys.org/news/2013-03-math-reveals-insight-life-born.html