Off-the-shelf Gaussian Process (GP) covariance functions encode smoothness assumptions on the structure of the function to be modeled. To model complex and non-differentiable functions, these smoothness assumptions are often too restrictive. One way to alleviate this limitation is to find a different representation of the data by introducing a feature space. This feature space is often learned in an unsupervised way, which might lead to data representations that are not useful for the overall regression task. In this paper, we propose Manifold Gaussian Processes, a novel supervised method that jointly learns a transformation of the data into a feature space and a GP regression from the feature space to the observed space. The Manifold GP is a full GP and allows learning data representations that are useful for the overall regression task. As a proof of concept, we evaluate our approach on complex non-smooth functions where standard GPs perform poorly, such as step functions and robotics tasks with contacts.
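The core idea described above can be sketched in code: compose a parametric feature map with a standard GP kernel, and evaluate the GP marginal likelihood on the mapped inputs so that the feature map and kernel can be optimized jointly. The sketch below is illustrative, not the paper's exact parameterization: the one-hidden-layer `tanh` map, the RBF kernel settings, and the fixed noise level are all assumptions chosen for brevity.

```python
import numpy as np

def feature_map(X, W, b):
    # Hypothetical learned transformation M(x) = tanh(W x + b); in the joint
    # objective, W and b would be optimized together with the GP hyperparameters.
    return np.tanh(X @ W + b)

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Standard squared-exponential kernel, applied in feature space.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def neg_log_marginal_likelihood(X, y, W, b, noise=0.1):
    # GP negative log marginal likelihood evaluated on the features M(X).
    # Minimizing this over (W, b) and the kernel hyperparameters is the
    # supervised joint training the abstract refers to.
    H = feature_map(X, W, b)
    K = rbf_kernel(H, H) + noise ** 2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.log(np.diag(L)).sum() \
        + 0.5 * len(X) * np.log(2 * np.pi)

def gp_predict(Xtr, y, Xte, W, b, noise=0.1):
    # Posterior mean of the GP in feature space at test inputs Xte.
    Htr, Hte = feature_map(Xtr, W, b), feature_map(Xte, W, b)
    K = rbf_kernel(Htr, Htr) + noise ** 2 * np.eye(len(Xtr))
    return rbf_kernel(Hte, Htr) @ np.linalg.solve(K, y)
```

Because the whole pipeline remains a GP (with a reparameterized kernel), the usual marginal-likelihood machinery applies unchanged; only the set of trainable parameters grows to include the feature map.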