Support for Knowledge Distillation #1187
Unanswered
thangld201 asked this question in Ideas
Replies: 2 comments, 1 reply
-
@mzr1996 Can you help me?
1 reply
-
We are planning to introduce some simple knowledge distillation methods to mmcls in the future. For now, you can try MMRazor, a dedicated model compression toolbox for OpenMMLab.
-
Hi, I'm looking for a way to distill from ViT to MobileNetV2, given checkpoints trained with mmclassification. This would involve running two dataloaders and two models at the same time with a distillation loss (e.g. KL divergence). I was looking for a way to initialize the dataloader and model separately, without using Runner (since this is not supported), so that the model's forward pass returns logits for a batch drawn from the dataloader, but I haven't figured out how to do so. Does anyone have any ideas? I'm new to mmclassification.
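Not a definitive recipe, but here is a minimal sketch of the manual loop described above, assuming mmcls 0.x / mmcv 1.x APIs (`Config.fromfile`, `build_dataset`, `build_dataloader`, `init_model`). The config and checkpoint paths, the hyperparameters, and the `get_logits` helper are illustrative placeholders: how to pull logits out of an mmcls classifier depends on the head class (e.g. `LinearClsHead` exposes an `fc` layer, a ViT head does not), so adapt that part to your models.

```python
# Rough, untested sketch: build the dataloader and both models from mmcls
# configs without a Runner, then run a hand-written distillation step.
import torch
import torch.nn.functional as F
from mmcv import Config
from mmcls.apis import init_model
from mmcls.datasets import build_dataset, build_dataloader

# Placeholder config/checkpoint paths; substitute your own files.
student_cfg = Config.fromfile('configs/mobilenet_v2/mobilenet-v2_8xb32_in1k.py')
teacher_cfg = Config.fromfile('configs/vision_transformer/vit-base-p16_pt-64xb64_in1k-224.py')

student = init_model(student_cfg, 'student.pth', device='cuda:0')
teacher = init_model(teacher_cfg, 'teacher.pth', device='cuda:0')
student.train()
teacher.eval()

# One dataloader built from the student's train pipeline. If the teacher needs
# a different pipeline (e.g. another input size), build a second dataset with
# the same sampling order or resize the batch on the fly.
dataset = build_dataset(student_cfg.data.train)
loader = build_dataloader(dataset, samples_per_gpu=32, workers_per_gpu=2,
                          dist=False, shuffle=True)

def get_logits(model, img):
    """Hypothetical helper: reaching the logits depends on the head class
    (LinearClsHead has `fc`; ViT-style heads differ) and the mmcls version
    (extract_feat may return a tensor or a tuple of per-stage features)."""
    feats = model.extract_feat(img)
    if isinstance(feats, tuple):
        feats = feats[-1]
    return model.head.fc(feats)

T = 4.0  # distillation temperature
optimizer = torch.optim.SGD(student.parameters(), lr=0.01, momentum=0.9)

for data in loader:
    # Assumes the default mmcls train pipeline collates 'img' and 'gt_label'
    # as plain tensors.
    img = data['img'].cuda()
    gt = data['gt_label'].cuda()

    with torch.no_grad():
        t_logits = get_logits(teacher, img)
    s_logits = get_logits(student, img)

    # Hinton-style KD: KL between temperature-softened distributions,
    # plus the usual cross-entropy on the ground-truth labels.
    kd = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                  F.softmax(t_logits / T, dim=1),
                  reduction='batchmean') * T * T
    ce = F.cross_entropy(s_logits, gt)
    loss = ce + kd

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The `T * T` factor is the standard scaling that keeps the soft-target gradients comparable in magnitude to the cross-entropy term as the temperature changes; everything else is an ordinary PyTorch training loop, which is why no Runner is needed.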