Knowledge Distillation with Perturbed Loss: From a Vanilla Teacher to a Proxy Teacher
Authors: Rongzhi Zhang, Jiaming Shen, Chao Zhang, and 4 more
Published: August 24, 2024
Topics: Computer Science, Machine Learning, Chemistry, Artificial Intelligence, Chromatography
DOI: 10.1145/3637528.3671851
License: CC-BY