Information, Entropy, Life And The Universe: What We Know And What We Do Not Know
F**R
A good dose of reality for the educated layman.
At last we have an honest popular science book filled with facts and solid, concrete ideas, rather than wild speculation or the usual tomes full of statements that overemphasise what science can do. Too many times have I seen what passes for popular science, sometimes written by respected scientists, in which the authors glorify the achievements of science with highly dubious claims. It is this kind of hyperbole which has forever turned me off just about every popular science book.

This book by Ben-Naim is divided into four chapters. The first introduces the idea of information, leading finally to Shannon's measure of information (SMI). This is followed by plenty of examples, from simple ideas of probability and 20-question games to various mixing processes. The second chapter concentrates on thermodynamic entropy for various processes in isolated systems, as well as discussions of the arrow of time in the Second Law and the common interpretation of entropy as disorder. These two chapters are then used to study how both information and entropy relate to the processes of life. This includes a study of the molecular structure of DNA and of information storage in the brain, as well as some discussion of so-called neg-entropy and "feeding on information". The last chapter concentrates instead on the universe, studying how many scientists have speculated on the entropy of the universe as well as its SMI. These two things are very different from each other, and it is clear that even great scientists mistake the two or equate them.

This book is an attempt to clarify these issues, which remain an ambiguous and confused mess in the popular science literature. The conclusion is that thermodynamic entropy is only ever defined for an isolated system at equilibrium, which clearly cannot be said of either living beings or the universe.
In addition, information, as studied using Shannon's information theory, overcomes some of these weaknesses: it can still be defined for a system that is neither at equilibrium nor isolated, provided that a probability distribution is defined for it. These clarifications completely demystify the speculative statements made in other popular science literature.
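The reviewer's point that SMI is defined for any probability distribution can be made concrete with a short sketch (my own illustration, not code from the book; the function name `smi` and the examples are mine):

```python
import math

def smi(probs):
    """Shannon's measure of information (in bits) of a discrete
    probability distribution, -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit; a certain outcome carries 0 bits.
print(smi([0.5, 0.5]))  # 1.0
print(smi([1.0]))       # 0.0

# The 20-questions intuition: 10 yes/no answers suffice to pin down one of
# 2**10 equally likely objects, and the SMI of that uniform distribution
# is exactly 10 bits.
print(smi([1 / 2**10] * 2**10))  # 10.0
```

Note that no equilibrium or isolation is assumed anywhere: any well-defined distribution has an SMI, which is precisely the reviewer's point.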
L**D
Does the Second Law Apply to Life and the Universe?
This is indeed a welcome and long-needed addition to the literature dealing with the connection between entropy and information theory. If nothing else, Ben-Naim's book serves as the cautionary statement on a bottle of medicine, warning the avid reader not to swallow all that is fed him in the pseudo-scientific popular literature that has grown up around the words entropy and information. Entropy was first coined by Rudolf Clausius almost two centuries ago, and Ben-Naim, with surgical precision, separates Clausius the great physicist, whose formulations of the first and second laws still stand today, from Clausius the dramatizer, who enshrouded his laws in the clichés that the energy of the universe is constant while its entropy `strives' to a maximum. Ben-Naim concludes that it is meaningless to talk about the energy and entropy of a universe that is not defined thermodynamically. Tell me its volume, number of particles, temperature, and pressure, and I will tell you its energy and entropy. Otherwise, the problem is ill-posed. Moreover, if such a universe is isolated, as we believe it to be, why should its entropy show a tendency to increase?

To the list of showmen, Ben-Naim adds:

- Peter Atkins, in his "Four Laws that Drive the Universe," exaggerates when he says that the second law accounts for the emergence of the intricately ordered forms of life. It certainly does not. In another unfulfilled promise, Atkins promises to show how chaos can run against Nature. Oddly enough, the journal bearing the same name judged Atkins's book as "going a long way to ease the confusion about the subject."

- Sean Carroll, in his "From Eternity to Here," asserts that, according to classical relativity, there is no way to reconstruct information.
As a matter of fact, general relativity does not touch on information at all; Carroll is conflating it with what has incorrectly become known as black hole thermodynamics, which deals with the hated singularities that Einstein tried at all costs to avoid by building bridges over them.

- Jacob Bekenstein constructs the second law of black holes on the pillars of Stephen Hawking's area theorem, which likens the statement that when two black holes collide the area of their event horizons is greater than the sum of their original areas to the increase in entropy when two bodies of the same kind, but at different temperatures, come into contact. Sadly, the particular case where the entropy is extensive, which occurs when the temperatures are the same, is covered neither by Hawking's theorem nor by Bekenstein's formulation of the second law. In fact, equilibrium between two black holes can never be achieved, so what type of second law are we talking about? And if that is not enough, Ben-Naim queries why bother calculating the entropy of a black hole to begin with, when so little is known about it.

- Jacques Monod, who epitomizes the adage that an expert should remain in his field of expertise, writes in his "Chance and Necessity" that "[i]ndeed, it is legitimate to view the irreversibility of evolution as an expression of the Second Law in the biosphere." Ben-Naim faults the Nobel Laureate, saying that the statement is not only false but moreover "deepens the `mystery' and `incomprehensibility' associated with entropy and the Second Law."
I totally agree.

Finally, although one is intrigued by Aaron Katchalsky's statement that "[l]ife is a constant struggle against the tendency to produce entropy by irreversible processes," I totally agree with Ben-Naim that the entropy of a living system per se is undefinable, and even if it were, no one would be able to quantify how much entropy it produces.

In short, Ben-Naim's message is to keep your sights down and seek answers to well-posed problems, while it is true that entropy has a privileged role in having one foot in the macroscopic world and the other in the microscopic one. As such, it is not only macroscopically measurable but can also be calculated microscopically from the permutation of balls into urns, as Ludwig Boltzmann so well appreciated. Ben-Naim would settle for an intermediary role in which entropy determines the probability distribution governing everything from games of chance to macroscopically disordered systems; in short, any system containing a large number of identical, randomly distributed elements where the outcome of an experiment is less than certain.
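The "permutation of balls into urns" remark can be sketched in a few lines of Python (my own illustration, not code from the book): for a large number N of balls, the log of the multinomial count W = N!/(n1!·n2!·…), taken per ball, approaches the SMI (in nats) of the occupation frequencies. This Stirling-approximation link is how Boltzmann's counting meets Shannon's measure.

```python
import math

def ln_multiplicity(counts):
    """ln W for W = N!/(n1!*n2!*...), the number of distinct ways to
    place N distinguishable balls into urns with the given occupancies."""
    n = sum(counts)
    return math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)

def smi_nats(probs):
    """Shannon's measure of information in nats, -sum(p * ln p)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# 100,000 balls split 1/2 : 1/4 : 1/4 among three urns.
n = 100_000
counts = [n // 2, n // 4, n // 4]

print(ln_multiplicity(counts) / n)   # ~1.0396, close to the SMI below
print(smi_nats([0.5, 0.25, 0.25]))   # 1.5*ln(2) ~ 1.0397
```

The gap between the two printed values shrinks as N grows, which is exactly the sense in which entropy is "calculated microscopically" from the counting of arrangements.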