LLaVA-MoD: Making LLaVA Tiny via MoE Knowledge Distillation

BibTeX
@misc{zhang2024llavamodmakingllava,
      title={LLaVA-MoD: Making LLaVA Tiny via MoE Knowledge Distillation}, 
      author={Lei Zhang and Hongsheng Li and Fangxun Shu and Haoyuan Li and Hao Jiang and Long Chen and Si Liu and Wanggui He and Siming Fu and Zhelun Yu and Le Zhuo and Yue Liao and Tao Zhong and Haonan Shi and Chenning Xu and Guanghao Zhang and Bolin Li},
      year={2024},
      eprint={2408.15881},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2408.15881}, 
}
GitHub
Repository: shufangxun/LLaVA-MoD
HTTPS: https://github.com/shufangxun/LLaVA-MoD
SSH: git@github.com:shufangxun/LLaVA-MoD.git
CLI: gh repo clone shufangxun/LLaVA-MoD