TY - GEN
T1 - Model transport
T2 - 27th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2014
AU - Freifeld, Oren
AU - Hauberg, Søren
AU - Black, Michael J.
N1 - Publisher Copyright: © 2014 IEEE.
PY - 2014/9/24
Y1 - 2014/9/24
N2 - We consider the intersection of two research fields: transfer learning and statistics on manifolds. In particular, we consider, for manifold-valued data, transfer learning of tangent-space models such as Gaussian distributions, PCA, regression, or classifiers. Though one would hope to simply use ordinary Rn-transfer-learning ideas, the manifold structure prevents it. We overcome this by basing our method on inner-product-preserving parallel transport, a well-known tool widely used in other problems of statistics on manifolds in computer vision. At first, this straightforward idea seems to suffer from an obvious shortcoming: Transporting large datasets is prohibitively expensive, hindering scalability. Fortunately, with our approach, we never transport data. Rather, we show how the statistical models themselves can be transported, and prove that for the tangent-space models above, the transport 'commutes' with learning. Consequently, our compact framework, applicable to a large class of manifolds, is not restricted by the size of either the training or test sets. We demonstrate the approach by transferring PCA and logistic-regression models of real-world data involving 3D shapes and image descriptors.
AB - We consider the intersection of two research fields: transfer learning and statistics on manifolds. In particular, we consider, for manifold-valued data, transfer learning of tangent-space models such as Gaussian distributions, PCA, regression, or classifiers. Though one would hope to simply use ordinary Rn-transfer-learning ideas, the manifold structure prevents it. We overcome this by basing our method on inner-product-preserving parallel transport, a well-known tool widely used in other problems of statistics on manifolds in computer vision. At first, this straightforward idea seems to suffer from an obvious shortcoming: Transporting large datasets is prohibitively expensive, hindering scalability. Fortunately, with our approach, we never transport data. Rather, we show how the statistical models themselves can be transported, and prove that for the tangent-space models above, the transport 'commutes' with learning. Consequently, our compact framework, applicable to a large class of manifolds, is not restricted by the size of either the training or test sets. We demonstrate the approach by transferring PCA and logistic-regression models of real-world data involving 3D shapes and image descriptors.
KW - Computer Vision
KW - Manifold-Valued Data
KW - PGA
KW - Riemannian Manifolds
KW - Scalable
KW - Statistics on Manifolds
KW - Transfer Learning
UR - http://www.scopus.com/inward/record.url?scp=84911412369&partnerID=8YFLogxK
U2 - 10.1109/CVPR.2014.179
DO - 10.1109/CVPR.2014.179
M3 - Conference contribution
T3 - Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
SP - 1378
EP - 1385
BT - Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Y2 - 23 June 2014 through 28 June 2014
ER -