Detailed Information


PAC-Net: A Model Pruning Approach to Inductive Transfer Learning

Author(s)
Myung, Sanghoon; Huh, In; Jang, Wonik; Choe, Jae Myung; Ryu, Jisu; Kim, Daesin; Kim, Kee-Eung; Jeong, Changwook
Issued Date
2022-07-17
URI
https://scholarworks.unist.ac.kr/handle/201301/75696
Fulltext
https://icml.cc/virtual/2022/poster/17175
Citation
International Conference on Machine Learning
Abstract
Inductive transfer learning aims to learn from a small amount of training data for the target task by utilizing a pre-trained model from the source task. Most strategies that involve large-scale deep learning models adopt initialization with the pre-trained model and fine-tuning for the target task. However, when using over-parameterized models, we can often prune the model without sacrificing the accuracy of the source task. This motivates us to adopt model pruning for transfer learning with deep learning models. In this paper, we propose PAC-Net, a simple yet effective approach for transfer learning based on pruning. PAC-Net consists of three steps: Prune, Allocate, and Calibrate (PAC). The main idea behind these steps is to identify essential weights for the source task, fine-tune on the source task by updating the essential weights, and then calibrate on the target task by updating the remaining redundant weights. Across an extensive set of inductive transfer learning experiments, we show that our method achieves state-of-the-art performance by a large margin.
Publisher
ICML
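
The three PAC steps described in the abstract lend themselves to a short sketch. Below is a minimal PyTorch illustration of the idea, assuming magnitude-based pruning and plain SGD; the helper names (magnitude_masks, masked_step), the keep ratio, and the update rule are illustrative assumptions, not the authors' reference implementation.

    # Minimal sketch of Prune-Allocate-Calibrate (PAC) with masked updates.
    # Assumes magnitude pruning and plain SGD; details are hypothetical.
    import torch
    import torch.nn as nn

    def magnitude_masks(model: nn.Module, keep_ratio: float = 0.5):
        """Prune: per-parameter boolean masks marking the largest-magnitude
        ("essential") weights identified on the source task."""
        masks = {}
        for name, p in model.named_parameters():
            k = max(1, int(p.numel() * keep_ratio))
            # k-th largest magnitude = (numel - k + 1)-th smallest
            thresh = p.detach().abs().flatten().kthvalue(p.numel() - k + 1).values
            masks[name] = p.detach().abs() >= thresh
        return masks

    def masked_step(model: nn.Module, masks, loss, lr: float,
                    update_essential: bool):
        """One SGD step that updates only the essential weights
        (Allocate, on the source task) or only the pruned ones
        (Calibrate, on the target task)."""
        model.zero_grad()
        loss.backward()
        with torch.no_grad():
            for name, p in model.named_parameters():
                m = masks[name] if update_essential else ~masks[name]
                p -= lr * p.grad * m

In this reading, Allocate runs masked_step(..., update_essential=True) on source-task batches, and Calibrate runs it with update_essential=False on target-task batches, so the source-task knowledge stored in the essential weights stays frozen while the redundant weights adapt to the target task.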

