Thursday, March 5, 2009

Best Philosopher in the Last 200 Years

The Leiter Reports is running a poll on who has been the best philosopher of the last 200 years. It is quite silly on many grounds (especially since John Rawls is on the short list, not that he isn't good fun to read), but I thought it might be fun. Here it is.

Sunday, March 1, 2009

Emerge this!

The Pitt HPS Club will be discussing the super-general, related-to-everything phenomenon of EMERGENCE at our next meeting. Below is a sexy podcast by WNYC's Radiolab, which will offer a good jumping-off point for discussion.

Feel free to post desired discussion topics below.

Monday, February 23, 2009

The Journal of Philosophical Studies

The new issue of the journal Philosophical Studies focuses on the Philosophy of Science: "Models, Methods, and Evidence: Topics in the Philosophy of Science / Proceedings of the 38th Oberlin Colloquium in Philosophy." For all of you who are interested (a.k.a. the insane), we all have access to the journal through PittCat. But really, the Godfrey-Smith article "Models and Fiction in Science" is quite interesting.

Wednesday, February 18, 2009

Entropy and You




Zeroth Law: If two thermodynamic systems are each in thermal equilibrium with a third, then they are in thermal equilibrium with each other.

First Law: Energy can neither be created nor destroyed; it can only change forms. In any process, the total energy of the universe remains the same. For a thermodynamic cycle, the net heat supplied to the system equals the net work done by the system.

Second Law: The entropy of an isolated system not in equilibrium will tend to increase over time, approaching a maximum value at equilibrium.

Third Law: As temperature approaches absolute zero, the entropy of a system approaches a constant minimum.

General Definitions:

Entropy is a measure of the disorder of a system. It can also be thought of as the amount of energy that becomes unavailable after the energy of a system undergoes a transformation.

A Closed System is one in which no outside material/energy/information is added to your subject space and no inside material/energy/information is removed from it.

Work is equal to Force × Distance, measured in joules, the preferred unit of energy; unlike homework…

Absolute Zero (0 kelvin) is equal to −273.15 degrees Celsius or −459.67 degrees Fahrenheit.
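Since the definitions above come with exact figures, a few lines of code can double as a sanity check (Python; the function names are my own, purely illustrative):

```python
# Quick check of the definitions above: work in joules, and the
# absolute-zero conversions to Celsius and Fahrenheit.

def work_joules(force_newtons, distance_meters):
    """Work = force * distance, in joules."""
    return force_newtons * distance_meters

def kelvin_to_celsius(t_kelvin):
    return t_kelvin - 273.15

def kelvin_to_fahrenheit(t_kelvin):
    return kelvin_to_celsius(t_kelvin) * 9 / 5 + 32

print(work_joules(10, 3))        # 10 N over 3 m -> 30 J
print(kelvin_to_celsius(0))      # -273.15
print(kelvin_to_fahrenheit(0))   # -459.67
```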

I'm going to examine Thermodynamics quickly for you all. First I want to give you some of the history of the development of thought concerning heat. I hope that will give you a kind of index so that you can individually follow your own interests with respect to this wide history. Please note that I may use terms here very loosely and without providing their physical definitions as we now understand them: terms such as Heat, Kinetic Energy, Caloric, etc., should be taken as pragmatic descriptions; they are used instead of the more correct, but anachronistic for that time, terms we now use. Then I'm going to provide an overview of the physical ideas which were generated by this history, in which I will give more precise definitions of the relevant terms. I'm going to focus mostly on the First and Second Laws while spending almost no time on the Third, which is more technical than conceptual. It is my hope that this section will help you understand the basic conceptual vocabulary of the subject space in question. And finally I'm going to highlight some of what I find to be the interesting philosophical questions generated by the field. This section will deal with the arrow of time, which, I'm sure, will be particularly interesting to you all, since we are going to be hearing a talk on it later in the week. Since this is an informal paper, forgive me for not citing sources.

Thermodynamic History:

Thermodynamics has a very long history and I'm not going to go back to ancient times, nor even, though I was tempted, back much before 1600. As with lots of history of science, I am going to start, perhaps without much warrant, at Francis Bacon. Bacon wasn't a scientist himself; he was a lawyer and educational reformer, but he was interested in Heat. And interestingly enough, in the New Organon (Novum Organum, Book II) he does suggest that heat is related to motion. As with most of his empirical work, well, not much can be said for it. Yet his focus on method, along with that of others (notably Newton, Descartes, and Galileo), no doubt was influential in the study of heat.

From here I'm going to skip over very important people who deserve much more time. I'll say only that Robert Hooke, in the 1660s, also thought that Heat was related to motion; and Gottfried Leibniz, in the late 1670s, developed what he called "Vis Viva" and said that this quantity was conserved. Newton, who generally hated everything about Leibniz (they had been arguing about who had invented calculus first for some time), said that this was nonsense. Newton actually felt that the conservation of Vis Viva was in conflict with his idea of the conservation of momentum. Vis Viva, which we might call kinetic energy, was eventually unified with the conservation of momentum in the idea of the conservation of energy as the First Law of Thermodynamics. But I'm getting ahead of myself.

Let me take some time to place this in a more historical perspective. In the 18th century there were competing theories of what heat was: was it a thing, namely Caloric, or was it a property of objects having something to do with motion? The former theory was championed by Lavoisier and had great explanatory power and success. These theories were said to come to a head with the experiments of Benjamin Thompson. A rather odd character, Thompson was a Massachusetts-born scientist (b. 1753) who left America for England during the American Revolution because he was a loyalist. Once there he remarried and became Count Rumford. In this supposed 'crucial experiment' Rumford showed that you could heat something indefinitely, and concluded that an object could not have an infinite amount of Caloric. Let us look at his experiment: he used a boring tool which would turn against a substance, in this case the metal of a cannon barrel, and heat it up enough to boil water, measuring frictional heat. He prolonged this experiment, done in or around 1798, and showed that the metal never lost its ability to produce heat. This seemed to him proof positive that the Caloric theory was wrong, but it is perhaps worth noting that many respected scientists did not agree with him: Pierre-Simon Laplace maintained the Caloric theory, as did John Dalton, the founder of the modern theory of the atom, who spent a lot of time in his "A New System of Chemical Philosophy" defending the Caloric.

Joseph Fourier published "The Analytic Theory of Heat" in 1822, which gave only a kinematic, that is, descriptive, way to measure and understand heat. He basically said that he wasn't interested in what Heat was, ontologically, but only in how it worked. This was very influential but, I would say, psychologically unsatisfying. After all, if Science wasn't telling us what things were, if it wasn't able to provide us with a picture of the world, well, what was it doing?

From here we should move on to Sadi Carnot, who did interesting work concerning Heat Engines. (I'm going to go into more depth as to what a Heat Engine is in the section on the physics of thermodynamics, but for now think of it as a car, or a refrigerator, or, more appropriately for Carnot's time, a steam engine.) Carnot, in his 1824 "Reflections on the Motive Power of Fire," argued for a law of conservation of caloric (we see that this theory just won't go away) and gave a quantitative relationship of work to heat. That is to say, he defined Work as the relationship between fuel and mechanical output. The work of Carnot, especially the Carnot Cycle, is what was later formulated into the Second Law of Thermodynamics.

From the work of Carnot, Rudolf Clausius, in his 1850 "On the Moving Force of Heat," was able to reframe heat through the atomic theory. Through this he was able to take the First Law, which had been kicked around for some time, and link it to what he truly formulated as the Second Law. He can, I think, be said to be the person who founded the modern field of "thermodynamics." And from his work, which showed that Entropy always increases, Lord Kelvin was able to argue that this meant Time was irreversible. This Arrow of Time comes out of the fact that energy in a system always becomes unavailable, or, as it is sometimes said: Entropy always increases in a closed system. At the time, this implied what was then called the Heat Death of the universe.

I'm going to end this spotty history here. I've butchered the history of science enough through my brevity, and going forward from 1850 would be almost impossible while still maintaining sensibility without also taking a more substantive approach. Needless to say, after Clausius many scientists began to be convinced that the Heat of an object was a form of Energy, specifically Kinetic Energy: the motion of atoms. It wasn't until later that the Atomic Theory was widely accepted, but that is another, though closely related, story. The next steps in this story are the ones to Atomic Theory and the Statistical Mechanical world view. But at least by 1850 we have developed the two most important concepts for us: the First and Second Laws.

The Physics:

With the atomic theory, thermodynamics is no longer a merely kinematic description of the world. You can give a purely kinematic account of thermodynamics, that is, you can make measurements which track these laws without ever talking about atoms, but the atomic hypothesis is very helpful in understanding the Why questions inspired by the study of heat. In statistical thermodynamics we don't use a purely reductive method of examination; that is, we don't look at what individual atoms are doing ("this atom bumped into that atom, which caused that atom to speed up, and so it has a given energy level"). Instead, we think about averages. A good analogy is sociological studies of, say, suicide, of the kind done by Emile Durkheim at the turn of the century. You cannot really tell if any one individual is going to kill themselves; but you can say, with a high degree of accuracy, how many individuals in a social system are going to kill themselves. In the same way you cannot really say what one atom is doing, but take a lot of them and you can say with a high degree of certainty what they are all doing.
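The averaging move can be seen in a few lines of code (Python; a purely illustrative simulation of my own, not anything from the physics literature): any single "particle" is anyone's guess, but the mean over a hundred thousand of them is remarkably stable.

```python
# Any single particle's speed is unpredictable, but the average
# over a large ensemble is highly predictable: the statistical
# move behind both Durkheim's suicide rates and thermodynamics.
import random

random.seed(1)
speeds = [random.uniform(0, 1000) for _ in range(100_000)]

print(speeds[0])                  # one particle: anyone's guess
print(sum(speeds) / len(speeds))  # ensemble average: close to 500
```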

First, the 0th and 1st laws are about energy and energy flow, whereas the 2nd and 3rd are concerned with entropy.

The 0th law was developed after the other three, but it was realized that it was somewhat more fundamental. Philosophers will enjoy this law because it is totally definitional: that is, it defines what we mean when we say something has Temperature.

Temperature becomes meaningful with 0th law:

The 0th law defines what Thermal Equilibrium means: if A is in equilibrium with B, and B is in equilibrium with C, then A and C are in equilibrium with each other. Remember, this is not "=" as in math but rather the triple bar of a logical equivalence. Put in standard terms: "if two thermodynamic systems are each in thermal equilibrium with a third, then they are in thermal equilibrium with each other." So, if we define A as having a temperature of 100 degrees Celsius, and we place a tube of mercury on that thing and wait until they come into equilibrium with each other, we mark how high the mercury gets in the tube… then we can test other things relative to the height of mercury in that tube at 100 degrees Celsius. Again, temperature is a property of objects here, and not a thing itself. The definition of what temperature is, that is, the "100 degrees" part, is artificial. This basically allows us to create measuring sticks for energy and provides meaning to our terms through their relations to other objects. This is all kinematic, but with the atomic theory we can get a dynamic picture/interpretation of these results: namely, that this property of objects is the average motion (that is, kinetic energy) of the atoms of the system.

1st law: Conservation applies to Energy:

Terms first: Temperature, Thermal Energy, Heat:

Temperature, as we have seen, is the average kinetic energy of an object.

Thermal Energy is not the average energy, like Temperature, but the total energy of a system. I'd go into this further but the distinction isn't really all that interesting… just think of this as bounding the system to be described as everything which your object affects. Other than that, it can be thought of, in shorthand, as the temperature of everything involved in a given thermodynamic system.

Heat should be used only as a verb, because if you use heat as a noun you are thinking about it as a substance and not as a property of a substance. So, you can heat something but something cannot have heat: (the latter would be fine to say if you subscribed to the caloric theory of heat.) Heat is the flow of thermal energy from one object to another. So you start out with the total kinetic energy of a system and can move it around in various ways, but you must end up with the same total energy in the end: this is the conservation of energy and the first law of Thermodynamics.

Conservation rules go back a long way in the philosophy of science, to the 12th-century English priest Adelard of Bath and his "Natural Questions." Basically, conservation means you cannot appeal to outside (extraphysical/metaphysical/theological) forces as causes or explanations of phenomena, meaning that in natural philosophy all phenomena are ontologically and epistemologically closed. In any case, the conservation of Energy here means that the Thermal Energy of a closed system will not increase or decrease, but can be transformed from one form of energy to another. (This begs the question "what is energy?" and I'll talk a little bit about that in the section on philosophical questions.) So again, Energy can neither be created nor destroyed; it can only change forms. In any process, the total energy of the universe remains the same. For a thermodynamic cycle the net heat supplied to the system equals the net work done by the system. From this you can see that you can measure the change in Thermal Energy by the addition of Work and Heating. An example: you can heat water by mechanical work (that is, you can move it around vigorously) and/or by heating it on your stove. Say you have a hot stove which you turn off (since we want to work with closed systems) and you put your pot of water on it; the two objects will come into thermal equilibrium with each other. We can see that this first law basically means we can do good bookkeeping of Thermal Energy. It hints at a good explanation of why your coffee goes to room temperature after a little time, though this is actually a fact of the Second Law.
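A toy version of that bookkeeping (Python; the specific heat of water is a standard figure, everything else is my own illustration): 4184 joules warms a kilogram of water by one degree, whether the joules arrive as stirring work or as heating from the stove.

```python
# First-law bookkeeping: energy added to water, whether as
# mechanical work (vigorous stirring) or as heating, raises its
# thermal energy by the same amount.

SPECIFIC_HEAT_WATER = 4184.0  # J per kg per kelvin (standard value)

def temperature_rise(energy_joules, mass_kg):
    """Temperature change of water from a given energy input."""
    return energy_joules / (mass_kg * SPECIFIC_HEAT_WATER)

work_input = 4184.0  # J of mechanical stirring
heat_input = 4184.0  # J from the (switched-off, cooling) stove

print(temperature_rise(work_input, 1.0))  # 1.0 degree rise
print(temperature_rise(heat_input, 1.0))  # same 1.0 degree rise
```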

2nd Law: Efficiency, Waste, and Disorder.

I think the best way into the second law is through examples. But first, I want to clarify the idea of Entropy a little more; it is sometimes a difficult concept, and may become clearer after the examples are worked through. Entropy is a measure of the disorder of a system, as I said. But that begs the question "what is meant by disorder?" Disorder means unpredictability, at least in one sense. Let's take a simple and often-given example: a deck of cards. If you open a box of cards it is ordered Ace through King and all in kinds (that is, divided into suits). If you do nothing to this closed system the disorder does not increase; it stays the same. But if you shuffle the deck, the cards will be less ordered. This is because there are a great number of ways the 52 cards can be arranged with respect to each other: in other words, there are a great number of states that your cards can be in. Now, it will never be the case that you take a well-shuffled deck of cards, shuffle it, and find that it is arranged Ace through King and all in its kinds. This illustrates the idea that the Entropy of a system increases. Now, you might say that it is not impossible that my cards spontaneously become ordered, just highly unlikely; and you'd be right. This is one of the most interesting concepts of the Statistical Mechanical world view, and it is what I will call "Stochastic Laws" in the section on interesting philosophical ideas. However, the physicist would answer this hypothetical a little differently. They would say that when you are dealing with atoms, you are dealing with such a large number of things (remember Avogadro's number) that the probabilities are so extreme as to be practically impossible.
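The arithmetic behind the card example is easy to check (Python): there are 52! equally likely orderings of a shuffled deck, so the chance that a fair shuffle lands on the one factory ordering is about one in 8 × 10^67.

```python
# There are 52! equally likely orderings of a shuffled deck, and
# only one of them is the factory "Ace through King" order.
import math

arrangements = math.factorial(52)
probability_sorted = 1 / arrangements

print(arrangements)        # about 8.07e67 possible orderings
print(probability_sorted)  # about 1.24e-68
```

Atoms are far more numerous than cards, which is why the physicist rounds "astronomically unlikely" up to "practically impossible."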

We can also think of this law, more simply and concretely, by saying that two things in contact will come into equilibrium with each other. Now, this does not have to be the case… that is, it is not logically necessary according to the conservation of energy. A hot object touching a cold object could get hotter while the cold object got colder and still preserve the conservation of energy: the fact that this doesn't happen is the second law. This might seem obvious, but it is helpful to define the obvious because it often, as in this case, leads to unexpected consequences. There are consequences of this with respect to efficiency. The maximum efficiency is one minus the ratio of the cold temperature (the cold bath) to the hot temperature (the hot bath), with both temperatures measured in kelvin. Let's go back to the heat engine: if you start with steam from 100-degree water pushing some kind of piston, that steam must then be dumped down to room temperature again (or else you don't have a cycle and the engine would stop, because all the water has turned to steam and can't be resteamed to make more work). You can only get about 20% efficiency from this process (according to Carnot's math). That means for every useful joule you get, you're giving up roughly four. (You can increase the efficiency, but you'd have to make things hotter than you really can for practical purposes.)
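Carnot's math for the steam example can be sketched directly (Python; the formula, efficiency = 1 − T_cold / T_hot with temperatures in kelvin, is the standard Carnot limit; the helper name is mine):

```python
# Carnot's limit: maximum efficiency = 1 - T_cold / T_hot,
# with both temperatures in kelvin.

def carnot_efficiency(t_cold_kelvin, t_hot_kelvin):
    return 1 - t_cold_kelvin / t_hot_kelvin

steam = 373.15  # 100 C boiler
room = 293.15   # ~20 C room-temperature exhaust

eff = carnot_efficiency(room, steam)
print(round(eff, 3))  # about 0.214, i.e. ~21% efficiency

wasted_per_useful_joule = (1 - eff) / eff
print(round(wasted_per_useful_joule, 1))  # ~3.7 J wasted per useful J
```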

Now we should take Carnot's heat engine and work through a kind of real-world example. A heat engine is a device that moves heat from one place to another. This means that it has to have what is called a Hot Bath and a Cold Bath. The Cold Bath on your car would be the tailpipe. Your refrigerator runs the cycle in reverse and dumps heat into your kitchen (or really, into the coils on the back of the box, which heat up when the fridge is operating and thereby dump heat into your colder kitchen). So, the question that comes out of a heat engine is efficiency: how much you get out of what you put in, how much energy is useful and how much is lost. (Interestingly, efficiency is dependent upon the use we put the fuel to and not some intrinsic quality of Energy. So, if we think that the energy which cannot be used mechanically is still useful, well, then our heat engine becomes more efficient.)

Ok, so if you put 100 joules of fuel into your car, you might get, totally hypothetically, 10 joules of mechanical work from that fuel. But where does the rest of the energy go? It cannot just disappear, because of the restrictions of the First Law. As I'm sure you know, this energy is translated into the heating of your car and the cycle of cooling that involves your tailpipe. So, if you have a car driving down a highway it is very orderly (all the atoms are going in the same general direction) but eventually that car might, let's say, crash. Now that energy is conserved but it is all over the place… it is not very usable… it is in a much more entropic state. What can't happen, because of all that I've said above (because this energy is less useful), is that the crashed car spontaneously gets itself back together and starts on down the highway again. This would be probabilistically impossible in exactly the same way that my card-shuffling example was probabilistically impossible. And this fact is what we call the arrow of time.

So, to recap, the 2nd Law of Thermodynamics is the fact that two bodies come into equilibrium with each other. From this fact we get Entropy, which is a quantifiable measure of Energy which becomes unavailable when two objects not in equilibrium approach an equilibrium value. Often this is said as “The entropy of an isolated system not in equilibrium will tend to increase over time, approaching a maximum value at equilibrium.”

3rd Law:

I'm going to skip this without doing much damage to our conceptual apparatus. In brief, the fact that entropy is temperature dependent generated the idea of absolute zero (as I defined at the outset).

Philosophical Questions:

Heraclitean versus Parmenidean metaphysics

This is, to my mind, the most interesting philosophical question posed by Thermodynamics. Heraclitean metaphysics are process conceptions, whereas Parmenidean are substance conceptions. Parmenides proposed the idea that the way to understand the world was through the idea of unchanging stuff (atoms). He said that in order to understand the world the natural philosopher must examine closely this unchanging stuff for its universal, necessary, and certain characteristics. Heraclitus, on the other hand, said that everyone recognizes experience to be particular, contingent, and admitting of some uncertainty. He therefore said that in order to understand the world you should look at the Logoi (plural of Logos, meaning rules or laws) of change. With the advent of the Calculus we were able to model change in a much more sophisticated way, and as the Statistical Mechanical world view became more prominent we started seeing that the world has qualities which are not exclusively Parmenidean; that is, knowledge of the world cannot be said to be universal, necessary, or certain.

This is also what I called the idea of "stochastic law," stochastic being a 75-cent word for statistical. It is quite odd, on its face, to say that a Law is only probably true. And of course this is what we do say with respect to Thermodynamics, as well as many other areas of modern physics.

I'd rather not say that the Heraclitean and Parmenidean metaphysics are in conflict with each other, but rather that they complement each other. But I think if modern physicists had to give up either the idea of the ultimate reality of Matter or the idea of the ultimate reality of Energy, they would en masse give up the idea of the reality of Matter (that is, side with a process rather than with a stuff).


The rise of the science of thermodynamics brings the idea of energy onto center stage. But energy is only real in specific forms, and no specific form of energy is really Energy. That is to say that Energy has a generic character, but it only manifests itself as Heat E, Gravitational E, Chemical Binding E, Electromagnetic E, etc. So there are rules, logoi, to the transformation of energy, because if there weren't you couldn't talk about reality, but the 'thing' Energy does not itself exist: it is ontologically vacuous. So we believe that each specific form of Energy is an instance of a more generic concept which is not itself observable or necessarily real. Another way to put this is that the generic form of energy is what is conserved, and what the first law applies to: the specific forms can be transformed however you like.

This really shows the gulf between atomistic/Parmenidean THINGHOOD thinking and the process/Heraclitean metaphysics of dynamic change. Energy is immaterial, but it has explanatory characteristics of matter: it has properties; it can act upon matter and change it without being material itself.

The Challenge from Creationism

Interestingly, the idea that Entropy always increases has provided a tool for antievolutionists the world over. If you VideoGoogle the second law you get more, so-called, proofs of God's existence than you get explanations of the law. Creationists claim that the fact that Entropy increases means that self-organizing systems are impossible without an appeal to a supra-natural force. I want to make a point of saying that objections to self-organizing systems based upon the second law are largely misguided. (For one thing, the law applies to closed systems, and the Earth, constantly fed energy by the sun, is not one.) First, such objections disregard lots of science which shows us that structure, process, and relations have causal consequences in the physical world. Second, if someone wants to understand self-organizing systems it might behoove them to look at the theory of self-organizing systems, variously called systems theory, chaos theory, or complexity theory. When we think of a system we are talking about a set of objects which are mutually adapted to one another in relation to the functioning of the whole assembly of parts. So there is a top-down character to systems theory which there isn't in the atomistic reductive mode, and this top-down character, creationists claim, must come from God. Let's walk this out further: the system has properties that are not possessed by its parts. For example, sodium chloride has characteristics that sodium and chlorine do not have and which could never be predicted from its base components: sodium chloride is yummy salt, but sodium alone is a metal which is explosive if it comes in contact with water, and chlorine is a toxic gas.

The appeal to God here is deeply flawed for various reasons; I’m going to point out only two:

First, this form of argument has been known, at least since Plato, as the Argument from Ignorance. That is, if you think that evolution cannot explain self-organization in light of entropy, that represents a problem for evolution and not support for the God hypothesis. (If X then Y; not Y; therefore Z: the logical form of this fallacy.) It is also not at all clear that evolutionists think this is a significant anomaly (to use Kuhn's word), or that it is not (or will not be) explainable in terms of Systems Theory.

Second, appealing to a supra-natural force violates the rules of epistemological and ontological conservation first described by Adelard of Bath oh so many years ago. Disregarding this is to not be playing the game of Science. If you want to play the game of Science you have to follow the rules (as with any game).

The Arrow of Time

Time is the Dimension in which True Novelty Occurs

There are generally considered to be three arrows of time: the Psychological, the Thermodynamic, and the Cosmological.

The Thermodynamic Arrow, I think, was well illustrated by my example of a car crashing. The basic idea is that since entropy increases, the processes described by classical physics are no longer reversible. This is actually why the Statistical Mechanical model was developed by Boltzmann and others: on the statistical interpretation, physics is still in principle reversible, and a decrease in entropy is only practically impossible. (That is, it is possible for hot object A to increase in temperature when in contact with cold object B as B gets colder; it is only extremely unlikely.)
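A toy model makes this concrete (Python; my own sketch of the classic Ehrenfest urn model, not anything from Boltzmann directly): nothing in the rules forbids all the particles drifting back into one box, yet in practice the count settles near fifty-fifty and stays there.

```python
# Ehrenfest urn toy model of the thermodynamic arrow: N particles
# in two boxes; each step, one randomly chosen particle hops to
# the other box. The count drifts toward 50/50 and, in practice,
# never returns to the all-in-one-box starting state.
import random

random.seed(0)
N = 100
left = N  # all particles start in the left box

counts = []
for step in range(5000):
    if random.randrange(N) < left:
        left -= 1  # the chosen particle was in the left box
    else:
        left += 1
    counts.append(left)

late = counts[2500:]
print(sum(late) / len(late))  # hovers near N/2 = 50
print(max(late))              # never gets back near the initial 100
```

Every individual hop is reversible; the one-way drift toward equilibrium is purely a matter of probability, which is the statistical reading of the second law.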

The Psychological Arrow is the fact that we see and think of things as having to go forward. A common explanation for this arrow is that it is dependent upon the Thermodynamic Arrow. This, I think, comes from the orthodoxy in Science and Philosophy which assumes that there is either still a mind/body problem or that there are only bodies. Both, I think, are rather curious, since Newton destroyed the idea of bodies and left only minds, but that is for another time.

The Cosmological Arrow has to do with the expansion of the universe. This model comes from the measurement of the curvature of the universe and is quite interesting. I'll say only this little bit: the universe is getting bigger. Not only that, but it is getting bigger because space itself is getting bigger: i.e., it is not only that atoms are moving apart from each other, it is that the very space in between particles is expanding.

Consequences, Questions, and Forward Thinking:

Physics started picking up philosophical steam around the turn of the 20th century. Then Einstein blew everyone's heads away by proving Newton was not quite so right: time and space are aspects of the same thing, and the rate at which time passes is relative. Everyone reeled, and tried to make sense of what had just happened. I think to a large extent this is still going on today. The real head trip for everyone was that over the 100 years after 1905 (Einstein's special relativity) evidence trickled, and then poured, in that almost all of Einstein's major predictions were correct: time passes slower at higher speeds; gravity bends light rays (and thus space itself); atomic bombs turn matter into huge amounts of energy. Furthermore, E=mc^2 links up with the conservation of energy (the first law) to say that, if we know anything about physics, everything that we have ever seen or can ever interact with has existed since anything has existed, because mass and energy cannot just be created in a closed system. This gives obvious problems for the beginning of the universe, for all of this stuff had to come from somewhere. It seems fashionable to defer this tough question and to simply say the laws of physics (including E=mc^2 and the 2nd law) break down at the beginning of the universe, but the fashionable is usually also the most problematic.

So here is the real question: "What makes the past different from the future?" For one, we know things about the past, and we don't know anything about the future. The difference between the past and the future is that the past has less entropy, less disorder, and the future has more. Since entropy cannot decrease in the universe as a whole (because it's the only true closed system), earlier times always have less entropy. And so one of the thorniest problems we have, what makes earlier times different from later times, is solved nicely: earlier = less entropy, later = more entropy.

Unfortunately, someone came along and wrecked the party. The only reason the 2nd law makes sense, the story goes, is because of the laws of probability. It is far more likely that, given lots of randomly moving things, they will spread out rather than condense on one point. And so the 2nd law is cast as an empirical truth, not a law at all, and is only true because the origin of the universe was a very low-entropy state. (But we're actually talking about the origin of the universe after the laws of physics start to take effect.) So statistically it's way more likely that the particles will spread out and increase entropy, but it is not a physical necessity. So, given that our initial universe had low entropy, what does that tell us about other things? This is what the talk by Mr. Carroll is going to be about, from what the website seems to say.

His potential topics (and our definitions) for these “other things” are:

Inflation: The universe getting bigger, as I explained above.

Quantum Gravity: Quantum as in quanta, as in smallest measurable whole parts of a thing. This is as opposed to continuous and can also be thought of as discretized matter/energy. For a while people thought light was continuous, but it turns out to have quantum particles which we know as photons. This has been instrumental in the understanding of optical physics as well as Evolution (genetics needed a discrete way to transfer characteristics). People now are looking for quanta of gravity and I’m sure we’ll be hearing a lot more about this after the LHC at CERN turns on.

Multiverse: The universe is not the only one of its kind, or so it is now supposed. "Multiverse" is one of various terms (Cosmos, Cosmi, space-time continuum, etc.) used to denote the idea that there is actually more than one closed system in existence. Right now, the current wisdom is that there are a great number of Verses.

If you still want a more in-depth but very friendly explanation of these things, I recommend the podcast "Astronomy Cast." You can find it in the iTunes store for free. The episodes on Einstein's relativity, Dark Matter, "what is the universe expanding into?", and the Big Bang might be helpful if you're really interested in the physics.

-Drafted by J.J. and Jono