Implementing Compositional Learning

Master's Defense
Dmitry Vagner
Date and Time

We present work on the initial stages of a type-safe deep learning library, written in the \Idris{} programming language. We provide several modules---each meant to exist as a distinct domain-specific language---that correspond to the different levels of abstraction involved in specifying neural network architectures. In particular, we present three such modules. The first is an implementation of a higher-order composition operator, which we call a flow. The second is an implementation of a learner, an abstract formulation---affording full compositionality---of supervised learning algorithms, as introduced in the paper "Backprop as Functor". The third is a type-safe library for defining and manipulating tensors of arbitrary rank. We hope that approaching machine learning from this perspective---dubbed "metalinguistic abstraction" in the classical text "Structure and Interpretation of Computer Programs"---will provide the relatively nascent deep learning discipline with the sort of structural principles that have proven invaluable for writing traditional software.
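To give a flavor of the type-safety involved, the following is a minimal sketch, in Idris, of what a rank-indexed tensor type might look like; the names (Tensor, matMul) and definitions here are illustrative assumptions, not the library's actual API.

```idris
import Data.Vect

-- A tensor indexed by its shape: a length-`rank` vector of dimensions.
-- Illustrative sketch only; not the actual library definition.
data Tensor : Vect rank Nat -> Type -> Type where
  Scalar : a -> Tensor [] a
  Dim    : Vect n (Tensor dims a) -> Tensor (n :: dims) a

-- A matrix product whose type forbids mismatched inner dimensions:
-- an [m, n] tensor can only be multiplied by an [n, p] tensor.
matMul : Num a => Tensor [m, n] a -> Tensor [n, p] a -> Tensor [m, p] a
```

Because shapes live in the types, an ill-formed architecture (e.g., composing layers with incompatible dimensions) is rejected at compile time rather than failing at run time.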

Advisor: Sayan Mukherjee
Committee: Vincent Conitzer, Ezra Miller