bcantrill ("Bryan Cantrill") wrote:
LLMs are not a threat to humanity -- humanity is a threat to humanity. The concern should not be for LLMs in and of themselves, but rather the actions that they will induce in humans. This feels obvious?