ePrints@IISc

Fast Graph Convolutional Recurrent Neural Networks

Kadambari, SK and Prabhakar Chepuri, S (2019) Fast Graph Convolutional Recurrent Neural Networks. In: 2019 53rd Asilomar Conference on Signals, Systems, and Computers, 3-6 Nov. 2019, Pacific Grove, CA, USA, pp. 467-471.

PDF: ASI_CON_SIG_SYS_COM_467-471_2019.pdf - Published Version (700kB; restricted to registered users)
Official URL: https://dx.doi.org/10.1109/IEEECONF44664.2019.9048...


This paper proposes a Fast Graph Convolutional Recurrent Neural Network (FGRNN) architecture to predict sequences with an underlying graph structure. The proposed architecture addresses the limitations of the standard recurrent neural network (RNN), namely vanishing and exploding gradients, which cause numerical instabilities during training. State-of-the-art architectures that combine gated RNNs, such as the Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU), with graph convolutions are known to improve numerical stability during training, but at the expense of model size, involving a large number of trainable parameters. FGRNN addresses this problem by adding a weighted residual connection with only two extra trainable parameters compared to the standard RNN. Numerical experiments on a real 3D point cloud dataset corroborate the proposed architecture. © 2019 IEEE.
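The abstract's key idea — a gate-free recurrent cell whose hidden state is a weighted residual combination of the candidate state and the previous state, adding only two scalar parameters — can be sketched as below. This is a minimal illustration under assumptions: the function names, the one-hop graph convolution `A_hat @ X @ W`, and the exact placement of the scalars `alpha` and `beta` are hypothetical, not taken from the paper itself.

```python
import numpy as np

def graph_conv(A_hat, X, W):
    # One-hop graph convolution: propagate node features over the
    # (normalized) adjacency A_hat, then mix feature channels with W.
    return A_hat @ X @ W

def fgrnn_step(A_hat, X_t, H_prev, W_x, W_h, b, alpha, beta):
    # Candidate state from graph-convolved input and previous hidden state,
    # as in a plain graph-convolutional RNN cell.
    H_tilde = np.tanh(
        graph_conv(A_hat, X_t, W_x) + graph_conv(A_hat, H_prev, W_h) + b
    )
    # Weighted residual connection: alpha and beta are the only two extra
    # trainable parameters relative to the standard (graph) RNN cell.
    return alpha * H_tilde + beta * H_prev

# Toy usage: 4 graph nodes, 3 input features, 5 hidden features.
rng = np.random.default_rng(0)
N, F_in, F_h = 4, 3, 5
A_hat = np.eye(N)                          # stand-in normalized adjacency
X_t = rng.standard_normal((N, F_in))
H_prev = rng.standard_normal((N, F_h))
W_x = rng.standard_normal((F_in, F_h))
W_h = rng.standard_normal((F_h, F_h))
b = np.zeros(F_h)
H_t = fgrnn_step(A_hat, X_t, H_prev, W_x, W_h, b, alpha=0.9, beta=0.1)
```

Keeping `beta` near 1 and `alpha` small makes the update close to an identity map on the hidden state, which is the usual intuition for why such residual connections mitigate vanishing gradients without the parameter cost of LSTM/GRU gates.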

Item Type: Conference Paper
Publication: Conference Record - Asilomar Conference on Signals, Systems and Computers
Publisher: IEEE Computer Society
Additional Information: cited By 0; Conference of 53rd Asilomar Conference on Signals, Systems and Computers, ACSSC 2019; Conference Date: 3 November 2019 Through 6 November 2019; Conference Code: 158954
Keywords: Computer circuits; Convolution; Convolutional neural networks; Graph structures; Network architecture; Numerical experiments; Numerical instability; Proposed architectures; Recurrent neural network (RNN); State of the art; Training parameters; Underlying graphs; Weighted residuals; Long short-term memory
Department/Centre: Division of Electrical Sciences > Electrical Communication Engineering
Date Deposited: 07 Sep 2020 06:24
Last Modified: 07 Sep 2020 06:24
URI: http://eprints.iisc.ac.in/id/eprint/65257
