Entropy is in general a measure of "disorder". It's not exactly a good definition per se, but that's how it's generally defined.

#color(blue)(DeltaS = int 1/T delq_"rev")#

The #del# implies that heat flow is not a state function (path-independent), but a path-dependent function. Entropy, however, is a path-independent function. (A numerical sketch of this contrast is at the end of this answer.)

Chaos theory basically states that a system where no randomness is involved in generating future states of the system can still be unpredictable. We do not need to get into the definition of what makes a chaotic system, because that is way outside the scope of the question.

An example of a chaotic system is when you work with numbers in computer programming that are near machine precision (just borderline too small, basically); they will be extremely difficult to keep entirely unchanged, even if you are just trying to print out a specific small number (say, near #10^(-16)# on a 64-bit Linux). That makes this chaotic system unpredictable: you expect #5.2385947493857347xx10^(-16)#, but you probably won't get that in a million tries. (See the first sketch below.)

Essentially, the basic tenet of chaos theory that relates to entropy is the idea that the system leans towards "disorder", i.e. towards states you cannot exactly predict. This implies that the universe is a chaotic system. (It is NOT the second law of thermodynamics.)

If you drop a bunch of non-sticky balls on the ground, you cannot guarantee that they will stay together AND fall onto the same exact spot each time, AND stay in place after falling. It is entropically favorable for them to separate from each other and scatter upon hitting the ground. That is, you cannot predict exactly how they will fall.

Even if you made them stick to each other, the balls system decreased in entropy simply from falling and becoming a system separate from the human system, and the human system decreased in entropy when the balls left his/her hands. Additionally, the universe has now increased in entropy because the number of systems considered has doubled (you + balls).

SO THEN HOW CAN ENTROPY BE A STATE FUNCTION, IF IT FOLLOWS CHAOS THEORY? It's always accounted for in some way, somehow. Fewer microstates available to the system = smaller entropy for the system.
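To make the machine-precision point concrete, here is a minimal Python sketch (assuming CPython with standard IEEE 754 64-bit doubles; the specific numbers are only illustrative):

```python
import sys

# Machine epsilon for a 64-bit double: the gap between 1.0 and the next
# representable float. Differences near this ~10^-16 scale are where
# "printing a specific small number" becomes unreliable.
print(sys.float_info.epsilon)   # 2.220446049250313e-16

# A classic consequence: decimal literals are stored as the nearest binary
# double, so this sum misses the "expected" answer by an amount near 10^-16.
print(0.1 + 0.2 == 0.3)         # False
print(0.1 + 0.2 - 0.3)          # ~5.5e-17, an error near machine epsilon
```

The binary value actually stored differs from the decimal you typed starting around the 16th significant digit, which is exactly the kind of "you probably won't get that" behavior described above.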
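And to back up the state-function claim, here is a hedged sketch (my own illustrative numbers for one mole of a monatomic ideal gas, using standard textbook relations, not part of the original answer) that evaluates both #q# and #DeltaS = int 1/T delq_"rev"# along two different reversible paths between the same two states:

```python
import math

# Illustrative check: take 1 mol of a monatomic ideal gas from state 1
# (T1, V1) to state 2 (T2, V2) along two different reversible routes.
# All numbers below are made up for the example.
R  = 8.314      # J/(mol*K), gas constant
Cv = 1.5 * R    # J/(mol*K), monatomic ideal gas
T1, V1 = 300.0, 0.010   # K, m^3
T2, V2 = 600.0, 0.030

# Path A: isothermal expansion at T1 (V1 -> V2), then isochoric heating (T1 -> T2).
#   isothermal: q = R*T1*ln(V2/V1), all absorbed at T = T1 -> S gains R*ln(V2/V1)
#   isochoric:  delq = Cv*dT                               -> S gains Cv*ln(T2/T1)
q_A  = R * T1 * math.log(V2 / V1) + Cv * (T2 - T1)
dS_A = R * math.log(V2 / V1) + Cv * math.log(T2 / T1)

# Path B: isochoric heating at V1 first (T1 -> T2), then isothermal expansion at T2.
q_B  = Cv * (T2 - T1) + R * T2 * math.log(V2 / V1)
dS_B = Cv * math.log(T2 / T1) + R * math.log(V2 / V1)

print(f"q_A  = {q_A:8.1f} J,   q_B  = {q_B:8.1f} J")    # different: q is path-dependent
print(f"dS_A = {dS_A:8.3f} J/K, dS_B = {dS_B:8.3f} J/K") # identical: S is a state function
```

The heat absorbed differs between the two routes, but #DeltaS# comes out identical either way; that is precisely what it means for entropy to be a state function while #q# is not.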