ePrints@IISc

Multi-source Subnetwork-level Transfer in CNNs Using Filter-Trees

Kumaraswamy, SK and Sastry, P and Ramakrishnan, K (2018) Multi-source Subnetwork-level Transfer in CNNs Using Filter-Trees. In: International Joint Conference on Neural Networks, IJCNN 2018, 8 - 13 July 2018, Rio de Janeiro.

PDF: IEEE_IJCNN_2018.pdf - Published Version (restricted to registered users only)
Official URL: https://doi.org/10.1109/IJCNN.2018.8489678

Abstract

Convolutional Neural Networks (CNNs) are very effective for many pattern recognition tasks. However, training deep CNNs requires extensive computation and large amounts of training data. In this paper we propose the Bank of Filter-Trees (BFT) as a transfer learning mechanism for improving the efficiency of learning CNNs. A filter-tree corresponding to a filter in the k-th convolutional layer of a CNN is the subnetwork consisting of that filter along with all its connections to filters in all preceding layers. An ensemble of such filter-trees, created from many CNNs learnt on different but related tasks, forms the BFT. To learn a new CNN, we sample from the BFT to select a set of filter-trees. This fixes the first few layers of the target network, and only the remaining layers are learnt from the training data of the new task. Through simulations we demonstrate the effectiveness of the BFT. The method constitutes a novel transfer learning technique in which transfer is at the subnetwork level; transfer can be effected from multiple source networks; the number of weights to be learnt is the same as for a single CNN; and, with no fine-tuning of the transferred weights, the performance achieved is quite good. In all our experiments the number of filter-trees sampled is kept equal to the number of filters in the k-th layer of the new CNN. This is not a limitation; it simply keeps the number of filters freshly learnt in the subsequent layers equal to that of a single CNN, for a fair comparison.
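The sampling step described in the abstract can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the authors' implementation): each source CNN is represented as a plain list of convolutional weight tensors, a filter-tree for a filter in layer k is that filter plus the full weight stack of all preceding layers it connects to, and the bank pools such trees from several sources. All names (`make_source`, `extract_filter_trees`, the dict layout) are assumptions made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(42)

def make_source(n_filters=8, k=3):
    """Hypothetical source CNN: a list of conv-weight tensors, one per layer,
    each of shape (n_filters, n_in_channels, h, w). All sources share the
    same architecture up to layer k, so their filter-trees are interchangeable."""
    layers, in_ch = [], 3
    for _ in range(k):
        layers.append(rng.standard_normal((n_filters, in_ch, 3, 3)))
        in_ch = n_filters
    return layers

def extract_filter_trees(net, k):
    """One filter-tree per filter in layer k: the filter's own weights ('leaf')
    plus all preceding layers it connects to ('support')."""
    leaf_layer = net[k - 1]
    return [{"leaf": leaf_layer[j], "support": net[: k - 1]}
            for j in range(leaf_layer.shape[0])]

# Bank of Filter-Trees (BFT) pooled from several source networks.
sources = [make_source() for _ in range(4)]
bft = [tree for net in sources for tree in extract_filter_trees(net, k=3)]

# Sample as many trees as the target net has filters in layer k, so the
# freshly-learnt remainder matches a single CNN in size (the paper's setup).
n_target_filters = 8
picked = rng.choice(len(bft), size=n_target_filters, replace=False)
frozen_leaves = np.stack([bft[i]["leaf"] for i in picked])
print(frozen_leaves.shape)  # (8, 8, 3, 3): the fixed layer-k filters
```

These transferred weights would then be held fixed (no fine-tuning), and only the layers above k would be trained on the new task's data.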

Item Type: Conference Paper
Publication: Proceedings of the International Joint Conference on Neural Networks
Publisher: Institute of Electrical and Electronics Engineers Inc.
Additional Information: The copyright for this article belongs to the IEEE.
Keywords: Convolution; Neural networks; Pattern recognition; Convolutional neural network; Improving efficiency; Multi-Sources; Multiple source; Sub-network; Training data; Transfer learning; Forestry
Department/Centre: Division of Electrical Sciences > Electrical Engineering
Date Deposited: 03 Aug 2022 06:32
Last Modified: 03 Aug 2022 06:32
URI: https://eprints.iisc.ac.in/id/eprint/75206
