OptNet

swMATH ID: 42660
Software Authors: Brandon Amos, J. Zico Kolter
Description: OptNet: Differentiable Optimization as a Layer in Neural Networks. This paper presents OptNet, a network architecture that integrates optimization problems (here, specifically in the form of quadratic programs) as individual layers in larger end-to-end trainable deep networks. These layers encode constraints and complex dependencies between the hidden states that traditional convolutional and fully-connected layers often cannot capture. We explore the foundations for such an architecture: we show how techniques from sensitivity analysis, bilevel optimization, and implicit differentiation can be used to exactly differentiate through these layers and with respect to layer parameters; we develop a highly efficient solver for these layers that exploits fast GPU-based batch solves within a primal-dual interior point method, and which provides backpropagation gradients with virtually no additional cost on top of the solve; and we highlight the application of these approaches in several problems. In one notable example, the method learns to play mini-Sudoku (4x4) given just input and output games, with no a priori information about the rules of the game; this highlights the ability of OptNet to learn hard constraints better than other neural architectures.
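
As a concrete illustration of the layer described above, the following is a minimal sketch of a QP layer inside a PyTorch module, assuming the companion qpth package released alongside OptNet (pip install qpth). The module name, layer sizes, and parameterization here are illustrative choices, not taken from the paper.

import torch
import torch.nn as nn
from qpth.qp import QPFunction

class OptNetLayerDemo(nn.Module):
    """Solves min_z 0.5 z'Qz + p'z  s.t.  Gz <= h, Az = b  as a layer."""
    def __init__(self, n_z=10, n_ineq=5, n_eq=2):
        super().__init__()
        # Parameterize Q = L L' + eps*I so the QP stays strictly convex,
        # which the primal-dual interior point solver requires.
        self.L = nn.Parameter(torch.randn(n_z, n_z) * 0.1)
        self.G = nn.Parameter(torch.randn(n_ineq, n_z) * 0.1)
        self.h = nn.Parameter(torch.ones(n_ineq))
        self.A = nn.Parameter(torch.randn(n_eq, n_z) * 0.1)
        self.b = nn.Parameter(torch.zeros(n_eq))

    def forward(self, p):
        n_z = self.L.shape[0]
        Q = self.L @ self.L.t() + 1e-3 * torch.eye(n_z)
        # QPFunction batches over the leading dimension of p and returns the
        # optimizer z*; gradients with respect to Q, p, G, h, A, b are
        # obtained by implicitly differentiating the KKT conditions.
        return QPFunction(verbose=False)(Q, p, self.G, self.h, self.A, self.b)

layer = OptNetLayerDemo()
p = torch.randn(32, 10, requires_grad=True)  # batch of 32 linear terms
z_star = layer(p)                            # forward: batched QP solve
z_star.sum().backward()                      # backward: implicit differentiation

Because the backward pass reuses the factorizations from the interior point solve, the gradient comes at virtually no additional cost on top of the forward solve, as noted in the description above.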
Homepage: https://arxiv.org/abs/1703.00443
Source Code: https://github.com/locuslab/optnet
Dependencies: Python
Related Software: Adam; PyTorch; CVXGEN; Julia; DiffSharp; torchdiffeq; TensorFlow; RMSprop; AdaGrad; ImageNet; L-BFGS; ALTRO; Dojo; DiffPills; HYPLAS; BigGAN; Wasserstein GAN; Flow++; POT; pix2pix
Cited in: 7 Publications