Differentiable Programming for Piecewise Polynomial Functions
Abstract
We introduce a new, principled approach to extend gradient-based optimization to piecewise smooth models, such as k-histograms, splines, and segmentation maps. We derive an accurate form of the weak Jacobian of such functions and show that it exhibits a block-sparse structure that can be computed implicitly and efficiently. We show that using the redesigned Jacobian leads to improved performance in applications such as denoising with piecewise polynomial regression models, data-free generative model training, and image segmentation.
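To make the block-sparsity claim concrete, here is a minimal sketch (not the authors' implementation) for the simplest case of a piecewise linear model with fixed breakpoints: the weak Jacobian of the outputs with respect to the per-segment coefficients is nonzero only in the block belonging to the segment that contains each sample.

```python
import numpy as np

# Hypothetical 3-segment piecewise linear model f(x) = a_k * x + b_k
# on segment k; breakpoints, slopes, and intercepts are illustrative.
breakpoints = np.array([0.0, 1.0, 2.0, 3.0])   # segment boundaries
slopes      = np.array([1.0, -2.0, 0.5])        # a_k per segment
intercepts  = np.array([0.0,  3.0, -2.0])       # b_k per segment

def active_segment(x):
    """Index k of the segment containing each sample."""
    return np.clip(np.searchsorted(breakpoints, x, side="right") - 1,
                   0, len(slopes) - 1)

def jacobian_wrt_params(x):
    """Weak Jacobian d f(x_i) / d (a_k, b_k).

    Block-sparse: row i has nonzeros only in the 2-wide block of
    its active segment (d/d a_k = x_i, d/d b_k = 1), so it can be
    stored and applied implicitly instead of as a dense matrix.
    """
    k = active_segment(x)
    J = np.zeros((len(x), 2 * len(slopes)))
    rows = np.arange(len(x))
    J[rows, 2 * k] = x        # derivative w.r.t. the active slope
    J[rows, 2 * k + 1] = 1.0  # derivative w.r.t. the active intercept
    return J

x = np.array([0.5, 1.5, 2.5])   # one sample per segment
J = jacobian_wrt_params(x)
print((J != 0).sum(axis=1))     # -> [2 2 2]: one 2-wide block per row
```

The same structure is what makes the implicit computation cheap: applying `J` or its transpose touches only one parameter block per sample, rather than all `2k` coefficients.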
Comments
This proceeding is published as Cho, Minsu, Ameya Joshi, Xian Yeow Lee, Aditya Balu, Adarsh Krishnamurthy, Baskar Ganapathysubramanian, Soumik Sarkar, and Chinmay Hegde. "Differentiable Programming for Piecewise Polynomial Functions." NeurIPS Thirty-fourth Annual Conference on Neural Information Processing Systems. Learning Meets Combinatorial Algorithms (LMCA): Workshop at NeurIPS 2020. December 6-12, 2020. Posted with permission.