Entropy
In physics, entropy measures the number of microscopic configurations compatible with a system's macroscopic state; the second law of thermodynamics states that the entropy of an isolated system never decreases, which is the precise sense in which such systems tend to move from order toward disorder over time. In information theory, entropy (introduced by Claude Shannon) measures the average uncertainty of a random variable: H(X) = -Σ p(x) log₂ p(x), expressed in bits. In a broader philosophical sense, entropy can symbolize unpredictability, fragmentation, and the breakdown of structured patterns.
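The information-theoretic notion can be made concrete with a small computation. The sketch below (an illustrative example, not from the original text; the function name `shannon_entropy` is our own) estimates Shannon entropy from the symbol frequencies of a sequence:

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Empirical Shannon entropy in bits: H = -sum(p * log2(p))."""
    counts = Counter(data)
    total = len(data)
    # Sum over observed symbols; p = 0 terms contribute nothing and never appear here.
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

print(shannon_entropy("aabb"))  # two equally likely symbols -> 1.0 bit
print(shannon_entropy("aaaa"))  # a single certain symbol -> 0.0 bits
```

A uniform distribution over two symbols yields the maximum one bit of uncertainty per symbol, while a constant sequence is perfectly predictable and carries zero entropy, matching the intuition of entropy as a measure of randomness.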