Mixture of experts (MoE) is a machine learning technique in which multiple expert networks divide a problem space into homogeneous regions. MoE is a form of ensemble learning, and such models were historically also called committee machines (Wikipedia).
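To make the idea concrete, here is a minimal sketch of an MoE layer in PyTorch. It is illustrative only: the expert count, hidden sizes, top-k routing, and all names (MoELayer, num_experts, top_k) are assumptions for this example, not the configuration of any released model. A gating network scores the experts for each input, and the output is a weighted combination of the top-scoring experts.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MoELayer(nn.Module):
        def __init__(self, dim: int, num_experts: int = 4, top_k: int = 2):
            super().__init__()
            # Each expert is a small feed-forward network that can specialize
            # in a region of the input space.
            self.experts = nn.ModuleList(
                nn.Sequential(nn.Linear(dim, 4 * dim), nn.ReLU(), nn.Linear(4 * dim, dim))
                for _ in range(num_experts)
            )
            # The gating network scores how relevant each expert is for an input.
            self.gate = nn.Linear(dim, num_experts)
            self.top_k = top_k

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, dim). Score all experts, keep only the top-k per input.
            scores = self.gate(x)                                    # (batch, num_experts)
            topk_scores, topk_idx = scores.topk(self.top_k, dim=-1)  # route to top-k experts
            weights = F.softmax(topk_scores, dim=-1)                 # normalize over chosen experts

            out = torch.zeros_like(x)
            for slot in range(self.top_k):
                idx = topk_idx[:, slot]              # expert chosen in this slot, per example
                w = weights[:, slot].unsqueeze(-1)   # its routing weight
                for e, expert in enumerate(self.experts):
                    mask = idx == e
                    if mask.any():
                        # Weighted contribution of this expert's output.
                        out[mask] += w[mask] * expert(x[mask])
            return out

    # Example usage: route a batch of 8 vectors of width 16 through the layer.
    layer = MoELayer(dim=16)
    y = layer(torch.randn(8, 16))
    print(y.shape)  # torch.Size([8, 16])

Because only the top-k experts run for each input, a model can hold many experts (and thus many parameters) while activating only a fraction of them per token, which is the main appeal of the approach in large language models.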
The new models, which combine multimodal capabilities with a mixture-of-experts architecture, are now available, while the larger Behemoth model remains in training.