LieTransformer: Equivariant self-attention for Lie Groups
by
Michael Hutchinson, Charline Le Lan, Sheheryar Zaidi, Emilien Dupont, Yee Whye Teh, Hyunjik Kim
2021
Abstract
Group equivariant neural networks are used as building blocks of group
invariant neural networks, which have been shown to improve generalisation
performance and data efficiency through principled parameter sharing. Such
works have mostly focused on group equivariant convolutions, building on the
result that group equivariant linear maps are necessarily convolutions. In this
work, we extend the scope of the literature to self-attention, which is
emerging as a prominent building block of deep learning models. We propose the
LieTransformer, an architecture composed of LieSelfAttention layers that are
equivariant to arbitrary Lie groups and their discrete subgroups. We
demonstrate the generality of our approach by showing experimental results that
are competitive with baseline methods on a wide range of tasks: shape counting on
point clouds, molecular property regression and modelling particle trajectories
under Hamiltonian dynamics.
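The equivariance property the abstract describes can be illustrated for the simplest Lie group, translations: a self-attention layer whose attention weights depend only on relative positions x_i - x_j commutes with translating its inputs. The sketch below is a minimal NumPy illustration of that general recipe (attention computed from group-invariant pairwise quantities), not the authors' implementation; the function name and shapes are illustrative, and LieSelfAttention generalises this idea to other Lie groups via a lifting to group elements.

```python
# Minimal sketch (assumption: illustrative only, not the paper's code):
# attention weights built from pairwise differences x_i - x_j are invariant
# to translations, so the layer output is translation equivariant (scalar
# features transform trivially under the group action).
import numpy as np

def translation_equivariant_attention(x, f):
    """x: (n, d) point coordinates; f: (n, c) features attached to the points."""
    diff = x[:, None, :] - x[None, :, :]           # (n, n, d) relative positions
    logits = -np.sum(diff ** 2, axis=-1)           # (n, n) translation-invariant logits
    w = np.exp(logits - logits.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)             # row-wise softmax
    return w @ f                                   # (n, c) attended features

rng = np.random.default_rng(0)
x, f = rng.normal(size=(5, 3)), rng.normal(size=(5, 4))
t = rng.normal(size=3)                             # a group element g: a translation
out = translation_equivariant_attention(x, f)
out_shifted = translation_equivariant_attention(x + t, f)
assert np.allclose(out, out_shifted)               # output unchanged under g
```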
Archived Files and Locations
application/pdf 1.4 MB
Locations: arxiv.org (repository), web.archive.org (webarchive)
arXiv: 2012.10885v4