ICKD

This repository provides the code for our paper "Exploring Inter-Channel Correlation for Diversity-preserved Knowledge Distillation". We provide training code on CIFAR-100 and evaluation code on ImageNet and Pascal VOC. The remaining training code will be released after the paper is accepted.
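
For reference, below is a minimal PyTorch sketch of an inter-channel correlation (ICC) distillation loss in the spirit of the paper: a C x C correlation matrix is computed over the channels of a feature map and matched between student and teacher. The exact formulation here (per-channel L2 normalization, MSE matching, and a 1x1 projection for mismatched channel widths) is an assumption and may differ from the released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def inter_channel_correlation(feat: torch.Tensor) -> torch.Tensor:
    """Per-sample C x C inter-channel correlation (Gram) matrix.

    feat: feature map of shape (N, C, H, W).
    """
    n, c, h, w = feat.shape
    feat = feat.view(n, c, h * w)                 # flatten spatial dimensions
    feat = F.normalize(feat, dim=2)               # L2-normalize each channel (assumption)
    return torch.bmm(feat, feat.transpose(1, 2))  # (N, C, C)


class ICCDistillLoss(nn.Module):
    """MSE between student and teacher inter-channel correlation matrices.

    The 1x1 projection for mismatched channel counts is hypothetical and
    only serves to make the sketch self-contained.
    """

    def __init__(self, s_channels: int, t_channels: int):
        super().__init__()
        self.proj = (nn.Conv2d(s_channels, t_channels, kernel_size=1)
                     if s_channels != t_channels else nn.Identity())

    def forward(self, f_s: torch.Tensor, f_t: torch.Tensor) -> torch.Tensor:
        icc_s = inter_channel_correlation(self.proj(f_s))
        icc_t = inter_channel_correlation(f_t.detach())  # no gradient through the teacher
        return F.mse_loss(icc_s, icc_t)
```

As a usage example, `ICCDistillLoss(64, 256)(student_feat, teacher_feat)` would match a 64-channel student feature map against a 256-channel teacher feature map.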

Preparation

  1. Download all checkpoints from https://drive.google.com/drive/folders/1ZvwEAVJurTXSuPL_0HylHMNUGbG1zl3U?usp=sharing (a loading sketch follows this list).
  2. Build a Docker image using the provided Dockerfile. All code should be run inside that image.
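
For a quick sanity check that a download is intact, a checkpoint can be inspected with `torch.load`; the path and the `"model"` key below are hypothetical and depend on how each checkpoint was saved.

```python
import torch

# Hypothetical filename; substitute whichever checkpoint you downloaded
# from the Google Drive folder above.
ckpt = torch.load("checkpoints/teacher_cifar100.pth", map_location="cpu")

# Some checkpoints wrap the weights in a dict; fall back to the object itself.
state_dict = ckpt["model"] if isinstance(ckpt, dict) and "model" in ckpt else ckpt
print(f"Loaded {len(state_dict)} parameter tensors")
```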

Running

Go into the directory of each task and follow the README there.

License

The license will be updated after the code is released non-anonymously.

Acknowledgement

This work builds on three repositories: RepDistiller (ICLR 2020), torchdistill (ICPR 2020), and OverHaul (ICCV 2019). Thanks for their great work.
