Entropy | 熵 shāng

Nothing is lost, nothing is created, everything is transformed.
― Antoine Lavoisier (26 August 1743 – 8 May 1794)

Unlike previous sessions, we started today's class with a quote. That is because entropy is genuinely hard to talk about directly, so we leaned on analogies (water flows from high to low; a broken mirror never, or almost never, becomes whole again) to draw attention to everyday behavior we usually take for granted.

One theory/hypothesis says the universe started with the Big Bang, a state of very low entropy. There are far more high-entropy states than low-entropy ones (imagine a ratio of 10…000 to 1), so a system has to wander through a vast number of high-entropy states before it is ever in a low-entropy one again. We have barely touched the topic here; our real goal is the so-called decision tree model, which we will cover tomorrow.
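One way to see why high-entropy states so vastly outnumber low-entropy ones is to count arrangements directly. Here is a small sketch (not something we did in class) using coin flips as a stand-in for microstates:

```python
from math import comb

# With n fair coins, every arrangement (microstate) is equally likely.
# "All heads" is a single, highly ordered arrangement; "half heads" is
# the disordered case and can occur in C(n, n/2) different ways.
n = 100
all_heads = 1                  # exactly one microstate
half_heads = comb(n, n // 2)   # number of half-heads microstates

print(f"all heads : {all_heads}")
print(f"half heads: {half_heads:.3e}")  # ~1.0e29 -- the "10...000 to 1" idea
```

A random shuffle of the coins will almost always land near half heads, simply because there are astronomically more such arrangements; that is the counting behind a mirror that never reassembles itself.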

To help you remember the word “entropy” and its meaning (as if we knew!): “en” comes from “energy”, and “tropy” comes from the Greek tropē, meaning “transformation”.

Entropy is a measure of the number of possible ways energy can be distributed in a system.
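To make that definition quantitative (we did not write formulas in class, so take this as a preview), the usual expressions are Boltzmann's S = k·ln W, where W is the number of equally likely arrangements, and Shannon's H = −Σ pᵢ·log₂ pᵢ for a probability distribution. The Shannon version is the one that shows up in decision trees. A minimal sketch in Python:

```python
from math import log2

def shannon_entropy(probs):
    """Entropy in bits: H = -sum(p * log2(p)) over the nonzero probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes; a sure thing has zero entropy.
print(shannon_entropy([0.5, 0.5]))  # 1.0 bit
print(shannon_entropy([1.0]))       # 0.0 bits
print(shannon_entropy([0.9, 0.1]))  # ~0.469 bits -- lopsided, so less uncertain
```

Presumably tomorrow's decision tree model will use exactly this quantity to decide which question to ask next: a good split is one that lowers the entropy of the labels the most.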

By the way, Lavoisier 拉瓦锡 was a great chemist.
