Large-scale models are revolutionizing deep learning and AI research, driving major improvements in language understanding, creative text generation, multilingual translation, and more. But despite their remarkable capabilities, the models’ large size creates latency and cost constraints that hinder the deployment of applications on top of them. In particular, increased inference time and memory consumption […]
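The inference latency and memory pressure described above is the problem DeepSpeed's inference engine targets. As a minimal sketch of what that looks like in practice (not an example from the post itself), assuming a standard Hugging Face checkpoint and DeepSpeed's public deepspeed.init_inference API, with the model name and settings chosen purely for illustration:

```python
# Minimal sketch: wrapping a Hugging Face model with DeepSpeed's
# inference engine to reduce latency and memory use. The model name
# and settings here are illustrative assumptions, not from the post.
import torch
import deepspeed
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # hypothetical small model for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# init_inference injects optimized inference kernels into the model;
# running in fp16 roughly halves memory relative to fp32 weights.
engine = deepspeed.init_inference(
    model,
    dtype=torch.float16,
    replace_with_kernel_inject=True,
)

inputs = tokenizer("DeepSpeed makes inference", return_tensors="pt").to("cuda")
outputs = engine.module.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same wrapper also accepts tensor-parallel settings to shard a model that is too large for a single GPU, which is the multi-GPU case the DeepSpeed-MoE work focuses on.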
Optimization approaches for Transformers [Part 2]
DeepSpeed powers 8x larger MoE model training with high performance - Microsoft Research
This AI newsletter is all you need #6 – Towards AI
(PDF) DeepSpeed-MoE: Advancing Mixture-of-Experts Inference and Training to Power Next-Generation AI Scale
Scaling laws for very large neural nets — The Dan MacKinlay stable of variably-well-consider'd enterprises
Latest News - DeepSpeed
DeepSpeed/docs/index.md at master · microsoft/DeepSpeed · GitHub
Edge#226: DeepSpeed Compression, a new library for extreme compression of deep learning models