Structured Representation Learning

From Homomorphisms and Disentanglement to Equivariance and Topography

£34.99

Subjects: Mathematical modelling, Computer science, Artificial intelligence, Machine learning, Computer vision, Image processing

Authors: Yue Song, Thomas Anderson Keller, Nicu Sebe, Max Welling

Collection: Synthesis Lectures on Computer Vision

Language: English

Published by: Springer

Published on: 18th May 2025

Format: LCP-protected ePub

ISBN: 9783031881114


Introduction

This book introduces approaches that generalize the benefits of equivariant deep learning to a broader set of learned structures through learned homomorphisms. In machine learning, the idea of incorporating knowledge of data symmetries into artificial neural networks is known as equivariant deep learning, and it has led to cutting-edge architectures for processing images and physical data. The power of these models comes from data-specific structure built into them through careful engineering. To date, however, practitioners can only build such structure into a model when the data exactly obey specific mathematical symmetries. The authors discuss naturally inspired inductive biases, in particular those that may offer efficiency and generalization benefits through what are known as homomorphic representations: a new, general type of structured representation inspired by techniques from physics and neuroscience. They review some of the first attempts at building models with learned homomorphic representations and demonstrate that these inductive biases improve the ability of models to represent natural transformations, ultimately paving the way toward efficient and effective artificial neural networks.
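As background for this description, equivariance has a standard textbook formulation (the notation f, G, and X below is generic and not quoted from the book itself): a map f is equivariant with respect to a group G acting on its input and output spaces if

\[
f(g \cdot x) = g \cdot f(x) \qquad \text{for all } g \in G,\ x \in X.
\]

Translation-equivariant convolutional layers are the canonical example: shifting the input image shifts the resulting feature maps by the same amount. The homomorphic representations described above aim to extend this kind of structure beyond settings where exact, hand-specified symmetries hold.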
