Reblogged by bcantrill ("Bryan Cantrill"):
rasmus91@fosstodon.org ("Rasmus Lindegaard") wrote:
Ollama is pretty cool in how easily it can be deployed, and Mistral 7B is absolutely astounding.
Thank you to @bcantrill and Alan for the talk with Simon Willison on #OxideAndFriends about LLMs that reawakened my interest in looking at this:
primarily the part about open source, open datasets, and being able to run a model locally without a $1000 GPU.