new MoE from ai2, EMO

AI2 releases EMO, a new Mixture-of-Experts model, adding to the growing MoE landscape from major research institutions.

Categories: Model Releases, Research

r/LocalLLaMA · 105 points · 12 comments · i.redd.it
