Implementation of ST-MoE, the latest incarnation of MoE after years of research at Google Brain, in PyTorch
Code for Dynamic Convolutions: Exploiting Spatial Sparsity for Faster Inference (CVPR 2020)
[NeurIPS 24] MoE Jetpack: From Dense Checkpoints to Adaptive Mixture of Experts for Vision Tasks
Unofficial implementation of Google DeepMind's Mixture of Depths.
Mixture of Experts from scratch