Mastodon Feed: Post

Reblogged by jsonstein@masto.deoan.org ("Jeff Sonstein"):

mpesce@arvr.social ("Mark Pesce") wrote:

Gemini 1.5 promises to be faster and more efficient thanks to a specialization technique called "mixture of experts," also known as MoE. Instead of running the entire model every time it receives a query, Gemini's MoE can use just the relevant parts...

http://windowscopilot.news/2024/02/22/heres-everything-you-need-to-know-about-gemini-1-5-googles-newly-updated-ai-model-that-hopes-to-challenge-openai/
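A minimal sketch of the general "mixture of experts" idea described above: a small gating network scores the experts for each input and only the top-k relevant experts run, so most of the model's parameters stay idle per query. The expert count, sizes, and top-k routing here are illustrative assumptions, not details of Gemini 1.5's actual architecture.

```python
# Illustrative top-k mixture-of-experts (MoE) routing sketch (NumPy).
# All sizes and the gating scheme are assumptions for demonstration only,
# not a description of Gemini 1.5's internals.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 8, 2

# Each "expert" is a small feed-forward weight matrix.
experts = [rng.standard_normal((d_model, d_model)) * 0.02 for _ in range(n_experts)]
# The gate scores how relevant each expert is to a given input vector.
gate_w = rng.standard_normal((d_model, n_experts)) * 0.02


def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route one token vector through only its top-k experts."""
    logits = x @ gate_w                      # relevance score per expert
    chosen = np.argsort(logits)[-top_k:]     # indices of the k best experts
    weights = np.exp(logits[chosen])
    weights /= weights.sum()                 # softmax over the chosen experts only
    # Only the chosen experts compute; the others are skipped entirely.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, chosen))


token = rng.standard_normal(d_model)
print(moe_forward(token).shape)  # (16,)
```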