The First Law of Complexodynamics
Implementations
Papers
- https://scottaaronson.blog/?p=762
Notes
- Why does "interestingness" first increase and then decrease over time, while entropy only ever increases?
- Entropy is roughly the amount of disorder in a system. The 2nd law says it increases or stays the same over time.
- Given the 2nd law, why was there less entropy (more order) earlier on?
- More disorder over time does not mean more interesting over time. There seems to be a sweet spot between order and disorder that humans find interesting, and the author argues this notion of interestingness can be made objective, not just a human preference.
- Use Kolmogorov complexity as a stand-in for entropy: the length of the shortest computer program that outputs a given string/state. To make it behave like thermodynamic entropy (monotonically increasing), it can be resource-bounded or defined probabilistically.
- We could have used Kolmogorov complexity for the complexity ("interestingness") metric itself, but it suffers from the problem that very random strings have maximal complexity despite being completely uninteresting.
- Bounding the measure by computation time (resource-bounded complexity) gives the desired shape: simple states score low, random states score low, and intermediate states score high.
- Gzip compressed size can act as a practical approximation to Kolmogorov complexity, since both measure compressibility.
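A minimal sketch of the gzip approximation idea (the helper name `gzip_complexity` is mine, not from the post): the compressed size of a highly ordered string is small, while a random string of the same length compresses hardly at all, mirroring the point that random strings get maximal Kolmogorov complexity.

```python
import gzip
import random

def gzip_complexity(s: str) -> int:
    """Approximate Kolmogorov complexity by the gzip-compressed size in bytes."""
    return len(gzip.compress(s.encode("utf-8")))

n = 10_000
ordered = "a" * n  # fully ordered: a tiny program ("print 'a' n times") describes it

random.seed(0)
rand = "".join(random.choice("ab") for _ in range(n))  # random: no short description

print(gzip_complexity(ordered))  # small compressed size
print(gzip_complexity(rand))     # much larger compressed size
```

Note this proxy only captures the entropy-like axis: it scores random strings *high*, so on its own it is not the sophistication/complextropy measure the post is after; it would need the resource-bounded or coarse-grained refinement described above.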