Abstract
Neural networks have become an increasingly popular tool for solving many
real-world problems. They are a general framework for differentiable
optimization that includes many other machine learning approaches as special
cases. In this thesis we build a category-theoretic formalism around a class of
neural networks exemplified by CycleGAN. CycleGAN is a collection of neural
networks, closed under composition, whose inductive bias is increased by
enforcing composition invariants, i.e. cycle-consistencies. Inspired by
Functorial Data Migration, we specify the interconnection of these networks
using a categorical schema, and network instances as set-valued functors on
this schema. We also frame neural network architectures, datasets, models, and
a number of other concepts in a categorical setting, and thus show that a
special class of functors, rather than functions, can be learned using
gradient descent. We use this category-theoretic framework to conceive a novel
neural network architecture whose goal is to learn the tasks of object
insertion and object deletion in images from unpaired data. We test the
architecture on three different datasets and obtain promising results.
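
As a concrete illustration of the composition invariants mentioned above, here
is a minimal sketch of the cycle-consistency condition in the standard CycleGAN
formulation; the notation G : X -> Y and F : Y -> X is assumed here and need
not match the notation used in the thesis. It asks that the two generators
approximately invert each other:

\[
\mathcal{L}_{\mathrm{cyc}}(G, F)
  = \mathbb{E}_{x}\big[\lVert F(G(x)) - x \rVert_1\big]
  + \mathbb{E}_{y}\big[\lVert G(F(y)) - y \rVert_1\big]
\]

Driving this loss toward zero enforces F(G(x)) ≈ x and G(F(y)) ≈ y, i.e. the
composite maps are constrained to be approximately the identity.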