Mastodon Feed: Post

Boosted by cstanhope@social.coop ("Your friendly 'net denizen"):
hazelweakly@hachyderm.io ("Hazel Weakly") wrote:

Ben Recht (@beenwrekt.bsky.social on Bluesky) has a nice article on The Bitter Lesson essay, which is here:

https://www.argmin.net/p/the-negroni-variation

I actually don't like the conclusion in the original Bitter Lesson essay. I think it's an astute observation, but I also think that it's misinterpreted by most.

I think the bitter lesson is mostly stating what happens when you misunderstand entropy.

Entropy has very deep connections to information theory, and one interesting property is that the more structured your information is, the less entropy it has.

In other words: structuring data removes more information than it creates. What is lost is the latent entropy that wasn't encoded.
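
To make that concrete, here is a small Python sketch (my own illustration, not from the post) of the "more structure, less entropy" claim: it measures empirical Shannon entropy per byte, and uniformly random bytes land near the 8-bit maximum while a similar amount of data pushed through a rigid, repetitive format comes out much lower. The record layout is invented purely for the demo.

```python
import json
import math
import os
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """Empirical Shannon entropy, H = -sum(p * log2(p)), in bits per byte."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Unstructured: uniformly random bytes sit near the 8 bits/byte maximum.
raw = os.urandom(64 * 1024)

# Structured: comparable data forced through a rigid, repetitive format
# has far less entropy per byte, because the structure is predictable.
records = [{"id": i, "value": i % 10} for i in range(4096)]
structured = json.dumps(records).encode("utf-8")

print(f"random bytes:    {entropy_bits_per_byte(raw):.2f} bits/byte")
print(f"structured JSON: {entropy_bits_per_byte(structured):.2f} bits/byte")
```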

Many times, that reduction of entropy is desirable. Parsing "any data" would be nearly impossible generically, but parsing a data format can be codified in a library quite directly.
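
A toy illustration of that contrast (my example, not the post's): a codified format takes one library call to parse, while an arbitrary blob with no declared format gives you nothing generic to parse against.

```python
import json

# A codified format: one library call recovers the structure directly.
record = json.loads('{"user": "hazel", "boosts": 3}')
print(record["boosts"])  # 3

# "Any data": with no declared format, there is nothing generic to parse
# against. All you can do is try formats you already know and give up.
mystery = b"\x93\x17hazel\x03"
try:
    json.loads(mystery)
except ValueError:
    print("not JSON; the structure (if any) was never encoded")
```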

But if you're trying to *understand* data... You want the latent information.

If you're trying to identify an effective way to think about the information required to solve a problem, and you can find a good data structure for it, that can make the problem tractable.

Rope data structures and text editing are a good example of that. Very specialized structure, but a generic use.
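
Here's a minimal rope sketch in Python to show the shape of the idea; it's a simplified illustration, not any real editor's implementation, and the class and function names are my own.

```python
# Minimal rope sketch: a binary tree of string fragments. Concatenation is
# O(1) (just make a new parent node), and indexing walks left or right by
# cumulative length instead of copying the whole buffer, which is what
# makes editing very large texts tractable.

class Leaf:
    def __init__(self, text: str):
        self.text = text
        self.length = len(text)

    def index(self, i: int) -> str:
        return self.text[i]

class Node:
    def __init__(self, left, right):
        self.left = left
        self.right = right
        self.length = left.length + right.length

    def index(self, i: int) -> str:
        # Descend into whichever side holds position i.
        if i < self.left.length:
            return self.left.index(i)
        return self.right.index(i - self.left.length)

def concat(a, b):
    """O(1) concatenation: no copying, just a new parent node."""
    return Node(a, b)

rope = concat(Leaf("The bitter "), concat(Leaf("lesson, "), Leaf("revisited")))
print(rope.length)  # 28
print("".join(rope.index(i) for i in range(rope.length)))
```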

However, if you're trying to encode information in a generic way, you can't constrain the data, or you'll lose more information than you gain.

In a world where we erroneously think we understand the human brain and the nature of knowledge, consistently failing to apply that knowledge... is bitter.

In other words: the only algorithms that scale universally for understanding complex systems work by exploiting entropy rather than by exploiting knowledge representations.

That, to me, is the true bitter lesson: acknowledging that humans and knowledge are complex in a way that resists codification.