Mixture of experts (MoE) is a machine learning technique in which multiple expert networks are used to divide a problem space into homogeneous regions, with a gating network deciding which experts to apply to each input. MoE is a form of ensemble learning.
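The idea can be sketched concretely: a gating function assigns each input a set of weights over the experts, and the model's output is the weighted combination of the expert outputs. The sketch below is a minimal, illustrative implementation with dense soft gating and linear experts; the class name `MixtureOfExperts` and all parameter names are assumptions for illustration, not taken from any particular library.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax along the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class MixtureOfExperts:
    """Minimal soft-gated MoE: each expert is a linear map, and a
    linear gating network produces softmax weights over the experts.
    Each expert thus dominates the region of input space where the
    gate assigns it high weight."""

    def __init__(self, n_experts, d_in, d_out):
        self.experts = [rng.normal(size=(d_in, d_out))
                        for _ in range(n_experts)]
        self.gate = rng.normal(size=(d_in, n_experts))

    def __call__(self, x):
        # x: (batch, d_in)
        weights = softmax(x @ self.gate)          # (batch, n_experts)
        # Stack each expert's output: (batch, n_experts, d_out)
        outs = np.stack([x @ W for W in self.experts], axis=1)
        # Combine expert outputs, weighted per input by the gate.
        return (weights[:, :, None] * outs).sum(axis=1)

moe = MixtureOfExperts(n_experts=4, d_in=8, d_out=3)
y = moe(rng.normal(size=(5, 8)))
print(y.shape)  # (5, 3)
```

In practice the gate and experts are trained jointly, and large-scale variants use sparse gating (routing each input to only a few experts) rather than the dense weighting shown here.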