Linguistics and English Language

Language evolution seminar

Speaker: Shangmin Guo (University of Edinburgh)

Title: A Possible Form of Transmission Phase for Deep Learning Agents

Abstract: In this work, we explore a form of transmission bottleneck for iterated learning with deep learning agents. We start by approximating how learning one sample modifies the prediction on another, via a first-order Taylor approximation. By analysing the resulting terms, we show that the empirical neural tangent kernel (eNTK) exposes the interactions between samples, and we find that the signs derived from labels also influence these interactions. We therefore propose the labelled pseudo neural tangent kernel (lpNTK), which takes label information into account when measuring the interactions between samples. We first prove that lpNTK asymptotically converges to the empirical NTK in Frobenius norm under certain assumptions, and illustrate how lpNTK helps explain learning phenomena reported in prior work, specifically the learning difficulty of samples and forgetting events during learning. Moreover, we show that lpNTK can improve the generalisation performance of neural network models on both image classification and deep reinforcement learning, and can thus form a transmission bottleneck.
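The abstract's core objects can be illustrated with a small sketch. Below, a minimal NumPy example computes an eNTK-style similarity for a linear model, where the gradient of the output with respect to the weights is just the input itself, so eNTK entries reduce to input inner products. It then forms a label-aware variant by scaling each gradient with a signed residual between the label and the prediction; this residual weighting is an illustrative assumption for exposition, not the exact lpNTK definition from the talk.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def entk(X):
    """eNTK for a linear model f(x) = w.x: gradient wrt w is x,
    so K[i, j] = <x_i, x_j> (the Gram matrix of the inputs)."""
    return X @ X.T

def label_signed_kernel(X, y, w):
    """Illustrative label-aware variant (an assumption, not the paper's
    exact lpNTK): scale each sample's gradient by the signed residual
    s_i = y_i - sigmoid(f(x_i)), so same-error samples reinforce each
    other (positive entries) and opposite-error samples conflict."""
    s = y - sigmoid(X @ w)          # signed residual per sample
    G = s[:, None] * X              # residual-scaled gradients
    return G @ G.T

# Toy data: 4 samples, 3 features, binary labels, zero-initialised weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))
y = np.array([1.0, 0.0, 1.0, 0.0])
w = np.zeros(3)

K = entk(X)
K_lp = label_signed_kernel(X, y, w)
print(K.shape, K_lp.shape)  # both (4, 4), symmetric
```

With zero-initialised weights every prediction is 0.5, so the residuals are +0.5 for positive labels and -0.5 for negative ones: pairs with matching labels keep the sign of their eNTK entry, while mismatched pairs flip it, which is the kind of label-driven interaction the abstract describes.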

Contact

Seminars are organised by the Centre for Language Evolution

Date: 11 October 2022


Room LG.10, 40 George Square, Edinburgh, EH8 9JX; online via link invitation