TY - JOUR
T1 - Handling Hard Affine SDP Shape Constraints in RKHSs
AU - Aubin-Frankowski, Pierre-Cyril
AU - Szabó, Zoltán
N1 - Publisher Copyright:
© 2022 Pierre-Cyril Aubin-Frankowski and Zoltán Szabó.
PY - 2022/10/1
Y1 - 2022/10/1
AB - Shape constraints, such as non-negativity, monotonicity, convexity, or supermodularity, play a key role in various applications of machine learning and statistics. However, incorporating this side information into predictive models in a hard way (for example, at all points of an interval) for rich function classes is a notoriously challenging problem. We propose a unified and modular convex optimization framework, relying on second-order cone (SOC) tightening, to encode hard affine SDP constraints on function derivatives, for models belonging to vector-valued reproducing kernel Hilbert spaces (vRKHSs). The modular nature of the proposed approach makes it possible to handle multiple shape constraints simultaneously and to tighten an infinite number of constraints into finitely many. We prove the convergence of the proposed scheme and of its adaptive variant, leveraging geometric properties of vRKHSs. Due to the covering-based construction of the tightening, the method is particularly well-suited to tasks with small to moderate input dimensions. The efficiency of the approach is illustrated in the context of shape optimization, safety-critical control, robotics, and econometrics.
KW - kernel derivatives
KW - matrix-valued kernel
KW - shape-constrained optimization
KW - vector-valued reproducing kernel Hilbert space
UR - https://www.scopus.com/pages/publications/85148071943
M3 - Article
AN - SCOPUS:85148071943
SN - 1532-4435
VL - 23
JO - Journal of Machine Learning Research
JF - Journal of Machine Learning Research
M1 - 297
ER -