
References

Published online by Cambridge University Press:  09 March 2023

Nicolas Boumal
Affiliation: École Polytechnique Fédérale de Lausanne

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2023

Access options

Get access to the full version of this content by using one of the access options below. (Log in options will check for institutional or personal access. Content may require purchase if you do not have access.)


[Abb84] Abbott, E. Flatland: A Romance of Many Dimensions. Seeley & Co., 1884. [see p. 2]
[ABBC20] Agarwal, N., Boumal, N., Bullins, B., and Cartis, C. Adaptive regularization with cubics on manifolds. Mathematical Programming, 188(1):85–134, 2020. [see pp. 148, 283, 294]
[ABG07] Absil, P.-A., Baker, C.G., and Gallivan, K.A. Trust-region methods on Riemannian manifolds. Foundations of Computational Mathematics, 7(3):303–330, 2007. [see pp. 140, 142, 147]
[ABM08] Alvarez, F., Bolte, J., and Munier, J. A unifying local convergence result for Newton’s method in Riemannian manifolds. Foundations of Computational Mathematics, 8(2):197–226, April 2008. [see p. 294]
[ADM+02] Adler, R.L., Dedieu, J.P., Margulies, J.Y., Martens, M., and Shub, M. Newton’s method on Riemannian manifolds and a geometric model for the human spine. IMA Journal of Numerical Analysis, 22(3):359–390, 2002. [see p. 146]
[AFPA07] Arsigny, V., Fillard, P., Pennec, X., and Ayache, N. Geometric means in a novel vector space structure on symmetric positive-definite matrices. SIAM Journal on Matrix Analysis and Applications, 29(1):328–347, 2007. [see p. 318]
[AK06] Absil, P.-A. and Kurdyka, K. On the stable equilibrium points of gradient systems. Systems & Control Letters, 55(7):573–577, July 2006. [see p. 78]
[AM12] Absil, P.-A. and Malick, J. Projection-like retractions on matrix manifolds. SIAM Journal on Optimization, 22(1):135–158, 2012. [see pp. 111, 114, 156]
[AMH09] Al-Mohy, A. and Higham, N. Computing the Fréchet derivative of the matrix exponential, with an application to condition number estimation. SIAM Journal on Matrix Analysis and Applications, 30(4):1639–1657, 2009. [see p. 71]
[AMS08] Absil, P.-A., Mahony, R., and Sepulchre, R. Optimization Algorithms on Matrix Manifolds. Princeton University Press, 2008. [see pp. xiii, 48, 65, 72, 77, 78, 113, 137, 142, 143, 146, 147, 156, 157, 174, 203, 204, 225, 246, 247, 277, 281, 294, 295]
[AMT13] Absil, P.-A., Mahony, R., and Trumpf, J. An extrinsic look at the Riemannian Hessian. In Nielsen, F. and Barbaresco, F., editors, Geometric Science of Information, volume 8085 of Lecture Notes in Computer Science, pages 361–368. Springer, 2013. [see pp. 113 and 245]
[AO15] Absil, P.-A. and Oseledets, I.V. Low-rank retractions: a survey and new results. Computational Optimization and Applications, 62(1):5–29, 2015. [see p. 164]
[AOBL20a] Alimisis, F., Orvieto, A., Bécigneul, G., and Lucchi, A. A continuous-time perspective for modeling acceleration in Riemannian optimization. In Chiappa, S. and Calandra, R., editors, Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, volume 108 of Proceedings of Machine Learning Research, pages 1297–1307. PMLR, 2020. [see p. 320]
[AOBL20b] Alimisis, F., Orvieto, A., Bécigneul, G., and Lucchi, A. Practical accelerated optimization on Riemannian manifolds. arXiv 2002.04144, 2020. [see p. 320]
[AS20] Ahn, K. and Sra, S. From Nesterov’s estimate sequence to Riemannian acceleration. In Conference on Learning Theory, pages 84–118. PMLR, 2020. [see p. 320]
[ASS+09] Agarwal, S., Snavely, N., Simon, I., Seitz, S.M., and Szeliski, R. Building Rome in a day. In Computer Vision, 2009 IEEE 12th International Conference on, pages 72–79. IEEE, 2009. [see p. 11]
[AZGL+18] Allen-Zhu, Z., Garg, A., Li, Y., Oliveira, R., and Wigderson, A. Operator scaling via geodesically convex optimization, invariant theory and polynomial identity testing. In Proceedings of the 50th Annual ACM SIGACT Symposium on Theory of Computing (STOC), pages 172–181, 2018. [see p. 298]
[Bac14] Bacák, M. Convex Analysis and Optimization in Hadamard Spaces, volume 22 of De Gruyter Series in Nonlinear Analysis and Applications. Walter de Gruyter GmbH & Co KG, 2014. [see p. 293]
[BAC18] Boumal, N., Absil, P.-A., and Cartis, C. Global rates of convergence for nonconvex optimization on manifolds. IMA Journal of Numerical Analysis, 39(1):1–33, February 2018. [see pp. 78 and 147]
[BAJN20] Berger, G.O., Absil, P.-A., Jungers, R.M., and Nesterov, Y. On the quality of first-order approximation of functions with Hölder continuous gradient. Journal of Optimization Theory and Applications, 185(1):17–33, 2020. [see p. 78]
[Bar95] Barvinok, A.I. Problems of distance geometry and convex properties of quadratic maps. Discrete & Computational Geometry, 13(1):189–202, 1995. [see p. 14]
[BBJN18] Bhojanapalli, S., Boumal, N., Jain, P., and Netrapalli, P. Smoothed analysis for low-rank solutions to semidefinite programs in quadratic penalty form. In Bubeck, S., Perchet, V., and Rigollet, P., editors, Proceedings of the 31st Conference on Learning Theory, volume 75 of Proceedings of Machine Learning Research, pages 3243–3270. PMLR, 06–09 Jul 2018. [see p. 15]
[BBV16] Bandeira, A.S., Boumal, N., and Voroninski, V. On the low-rank approach for semidefinite programs arising in synchronization and community detection. In Proceedings of the 29th Conference on Learning Theory, COLT 2016, New York, NY, June 23–26, 2016. [see p. 15]
[BC70] Brickell, F. and Clark, R.S. Differentiable Manifolds: An Introduction. Van Nostrand Reinhold, 1970. [see pp. xiv, 2, 176, 183, 184, 185, 203, 246]
[Ber22] Bergmann, R. Manopt.jl: Optimization on manifolds in Julia. Journal of Open Source Software, 7(70):3866, 2022. [see p. 149]
[Bes87] Besse, A.L. Einstein Manifolds. Springer, 1987. [see p. 249]
[BFM17] Bento, G.C., Ferreira, O.P., and Melo, J.G. Iteration-complexity of gradient, subgradient and proximal point methods on Riemannian manifolds. Journal of Optimization Theory and Applications, 173(2):548–562, 2017. [see pp. 78 and 320]
[BH15] Bodmann, B.G. and Haas, J. Frame potentials and the geometry of frames. Journal of Fourier Analysis and Applications, 21(6):1344–1383, May 2015. [see pp. 78 and 245]
[BH19] Bergmann, R. and Herzog, R. Intrinsic formulation of KKT conditions and constraint qualifications on smooth manifolds. SIAM Journal on Optimization, 29(4):2423–2444, 2019. [see pp. 78 and 146]
[Bha07] Bhatia, R. Positive Definite Matrices. Princeton University Press, 2007. [see pp. 318 and 319]
[BHSL+21] Bergmann, R., Herzog, R., Silva Louzeiro, M., Tenbrinck, D., and Vidal-Núñez, J. Fenchel duality theory and a primal-dual algorithm on Riemannian manifolds. Foundations of Computational Mathematics, 21(6):1465–1504, 2021. [see p. 320]
[BKVH07] Boyd, S., Kim, S.-J., Vandenberghe, L., and Hassibi, A. A tutorial on geometric programming. Optimization and Engineering, 8(1):67–127, 2007. [see pp. 298 and 316]
[BM03] Burer, S. and Monteiro, R.D.C. A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization. Mathematical Programming, 95(2):329–357, 2003. [see p. 15]
[BM05] Burer, S. and Monteiro, R.D.C. Local minima and convergence in low-rank semidefinite programming. Mathematical Programming, 103(3):427–444, 2005. [see pp. 14 and 15]
[BM06] Brace, I. and Manton, J.H. An improved BFGS-on-manifold algorithm for computing weighted low rank approximations. In Proceedings of the 17th International Symposium on Mathematical Theory of Networks and Systems, pages 1735–1738, 2006. [see p. 78]
[BMAS14] Boumal, N., Mishra, B., Absil, P.-A., and Sepulchre, R. Manopt, a Matlab toolbox for optimization on manifolds. Journal of Machine Learning Research, 15(42):1455–1459, 2014. [see p. 149]
[Bon13] Bonnabel, S. Stochastic gradient descent on Riemannian manifolds. IEEE Transactions on Automatic Control, 58(9):2217–2229, 2013. [see p. 78]
[BV21] Breiding, P. and Vannieuwenhoven, N. The condition number of Riemannian approximation problems. SIAM Journal on Optimization, 31(1):1049–1077, 2021. [see p. 114]
[BVB16] Boumal, N., Voroninski, V., and Bandeira, A.S. The non-convex Burer–Monteiro approach works on smooth semidefinite programs. In Lee, D.D., Sugiyama, M., Luxburg, U.V., Guyon, I., and Garnett, R., editors, Advances in Neural Information Processing Systems 29, pages 2757–2765. Curran Associates, Inc., 2016. [see p. 15]
[BVB19] Boumal, N., Voroninski, V., and Bandeira, A.S. Deterministic guarantees for Burer–Monteiro factorizations of smooth semidefinite programs. Communications on Pure and Applied Mathematics, 73(3):581–608, 2019. [see p. 15]
[BZ05] Borwein, J. and Zhu, Q. Techniques of Variational Analysis. CMS Books in Mathematics. Springer-Verlag, 2005. [see p. 50]
[BZA20] Bendokat, T., Zimmermann, R., and Absil, P.-A. A Grassmann manifold handbook: basic geometry and computational aspects. arXiv preprint arXiv:2011.13699, 2020. [see pp. 72, 245, 246]
[CB22a] Criscitiello, C. and Boumal, N. An accelerated first-order method for nonconvex optimization on manifolds. Foundations of Computational Mathematics, 2022. [see pp. 294 and 295]
[CB22b] Criscitiello, C. and Boumal, N. Negative curvature obstructs acceleration for strongly geodesically convex optimization, even with exact first-order oracles. In Loh, P.-L. and Raginsky, M., editors, Proceedings of Thirty Fifth Conference on Learning Theory, volume 178 of Proceedings of Machine Learning Research, pages 496–542. PMLR, 02–05 Jul 2022. [see p. 320]
[CGT00] Conn, A.R., Gould, N.I.M., and Toint, P.L. Trust-Region Methods. MPS-SIAM Series on Optimization. Society for Industrial and Applied Mathematics, 2000. [see pp. 139, 140, 141, 147]
[CGT11a] Cartis, C., Gould, N.I.M., and Toint, P.L. Adaptive cubic regularisation methods for unconstrained optimization. Part II: worst-case function- and derivative-evaluation complexity. Mathematical Programming, 130:295–319, 2011. [see p. 148]
[CGT11b] Cartis, C., Gould, N.I.M., and Toint, P.L. Adaptive cubic regularisation methods for unconstrained optimization. Part I: motivation, convergence and numerical results. Mathematical Programming, 127(2):245–295, 2011. [see p. 148]
[CGT12] Cartis, C., Gould, N.I.M., and Toint, P.L. Complexity bounds for second-order optimality in unconstrained optimization. Journal of Complexity, 28(1):93–108, 2012. [see p. 147]
[Cha06] Chavel, I. Riemannian Geometry: A Modern Introduction, volume 108 of Cambridge Tracts in Mathematics. Cambridge University Press, 2006. [see p. 294]
[Cif21] Cifuentes, D. On the Burer–Monteiro method for general semidefinite programs. Optimization Letters, 15(6):2299–2309, 2021. [see p. 15]
[CLR18] Curtis, F.E., Lubberts, Z., and Robinson, D.P. Concise complexity analyses for trust region methods. Optimization Letters, 12(8):1713–1724, June 2018. [see p. 147]
[CM19] Cifuentes, D. and Moitra, A. Polynomial time guarantees for the Burer–Monteiro method. arXiv 1912.01745, 2019. [see p. 15]
[CMRS20] Chewi, S., Maunu, T., Rigollet, P., and Stromme, A.J. Gradient descent algorithms for Bures–Wasserstein barycenters. In Conference on Learning Theory (COLT), pages 1276–1304. PMLR, 2020. [see p. 320]
[CMSZ20] Chen, S., Ma, S., So, A.M.C., and Zhang, T. Proximal gradient method for nonsmooth optimization over the Stiefel manifold. SIAM Journal on Optimization, 30(1):210–239, 2020. [see p. 78]
[dBEG08] d’Aspremont, A., Bach, F., and El Ghaoui, L. Optimal solutions for sparse principal component analysis. The Journal of Machine Learning Research, 9:1269–1294, 2008. [see p. 9]
[dC92] do Carmo, M.P. Riemannian Geometry. Mathematics: Theory & Applications. Birkhäuser Boston Inc., 1992. Translated from the second Portuguese edition by Francis Flaherty. [see pp. 247, 248, 295]
[dCN95] da Cruz Neto, J.X. Métodos Geodésicos na Programação Matemática. PhD thesis, COPPE/UFRJ, Rio de Janeiro, Brazil, 1995. [see p. 294]
[dCNdLO98] da Cruz Neto, J.X., de Lima, L.L., and Oliveira, P.R. Geodesic algorithms in Riemannian geometry. Balkan Journal of Geometry and Its Applications, 3(2):89–100, 1998. [see pp. 78, 294, 319]
[DE99] Dieci, L. and Eirola, T. On smooth decompositions of matrices. SIAM Journal on Matrix Analysis and Applications, 20(3):800–819, 1999. [see p. 72]
[Deh95] Dehaene, J. Continuous-time matrix algorithms, systolic algorithms and adaptive neural networks. PhD thesis, Katholieke Universiteit Leuven, 1995. [see p. 72]
[DH94] Dudek, E. and Holly, K. Nonlinear orthogonal projection. Annales Polonici Mathematici, 59(1):1–31, 1994. [see p. 114]
[DH19] Douik, A. and Hassibi, B. Manifold optimization over the set of doubly stochastic matrices: a second-order geometry. IEEE Transactions on Signal Processing, 67(22):5761–5774, November 2019. [see p. 175]
[DMV99] Dehaene, J., Moonen, M., and Vandewalle, J. Analysis of a class of continuous-time algorithms for principal component analysis and subspace tracking. IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications, 46(3):364–372, 1999. [see p. 72]
[dOF20] de Oliveira, F.R. and Ferreira, O.P. Newton method for finding a singularity of a special class of locally Lipschitz continuous vector fields on Riemannian manifolds. Journal of Optimization Theory and Applications, 185(2):522–539, 2020. [see p. 294]
[EAS98] Edelman, A., Arias, T.A., and Smith, S.T. The geometry of algorithms with orthogonality constraints. SIAM Journal on Matrix Analysis and Applications, 20(2):303–353, 1998. [see pp. 2, 77, 157, 175, 243, 245]
[FCPJ04] Fletcher, P.T., Lu, C., Pizer, S.M., and Joshi, S. Principal geodesic analysis for the study of nonlinear statistics of shape. IEEE Transactions on Medical Imaging, 23(8):995–1005, August 2004. [see p. 175]
[Fep17] Feppon, F. Riemannian geometry of matrix manifolds for Lagrangian uncertainty quantification of stochastic fluid flows. Master’s thesis, Massachusetts Institute of Technology, 2017. [see p. 72]
[FL19] Feppon, F. and Lermusiaux, P.F.J. The extrinsic geometry of dynamical systems tracking nonlinear matrix projections. SIAM Journal on Matrix Analysis and Applications, 40(2):814–844, 2019. [see p. 72]
[Fle13] Fletcher, T.P. Geodesic regression and the theory of least squares on Riemannian manifolds. International Journal of Computer Vision, 105(2):171–185, November 2013. [see p. 298]
[FLP20] Ferreira, O.P., Louzeiro, M.S., and Prudente, L.F. Iteration-complexity and asymptotic analysis of steepest descent method for multiobjective optimization on Riemannian manifolds. Journal of Optimization Theory and Applications, 184:507–533, December 2020. [see p. 320]
[FM20] Franks, C. and Moitra, A. Rigorous guarantees for Tyler’s M-estimator via quantum expansion. In Abernethy, J. and Agarwal, S., editors, Proceedings of Thirty Third Conference on Learning Theory (COLT), volume 125 of Proceedings of Machine Learning Research, pages 1601–1632. PMLR, 09–12 Jul 2020. [see p. 298]
[FO98] Ferreira, O.P. and Oliveira, P.R. Subgradient algorithm on Riemannian manifolds. Journal of Optimization Theory and Applications, 97(1):93–104, April 1998. [see p. 320]
[FORW21] Franks, C., Oliveira, R., Ramachandran, A., and Walter, M. Near optimal sample complexity for matrix and tensor normal models via geodesic convexity. arXiv preprint arXiv:2110.07583, 2021. [see p. 298]
[FS02] Ferreira, O.P. and Svaiter, B.F. Kantorovich’s theorem on Newton’s method in Riemannian manifolds. Journal of Complexity, 18(1):304–329, 2002. [see p. 294]
[Gab82] Gabay, D. Minimizing a differentiable function over a differential manifold. Journal of Optimization Theory and Applications, 37(2):177–219, 1982. [see pp. 2 and 77]
[GH16] Grohs, P. and Hosseini, S. ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics, 42(2):333–360, 2016. [see p. 320]
[GHL04] Gallot, S., Hulin, D., and Lafontaine, J. Riemannian Geometry. Springer-Verlag, 2004. [see pp. 240, 247, 249, 294]
[GQ20] Gallier, J. and Quaintance, J. Differential Geometry and Lie Groups. Springer International Publishing, 2020. [see p. 247]
[Gri81] Griewank, A. The modification of Newton’s method for unconstrained optimization by bounding cubic terms. Technical Report NA/12, Department of Applied Mathematics and Theoretical Physics, University of Cambridge, 1981. [see p. 148]
[GW08] Griewank, A. and Walther, A. Evaluating Derivatives: Principles and Techniques of Algorithmic Differentiation. Society for Industrial and Applied Mathematics, 2nd edition, 2008. [see p. 75]
[GZAL14] Goes, J., Zhang, T., Arora, R., and Lerman, G. Robust stochastic principal component analysis. In Kaski, S. and Corander, J., editors, Proceedings of the 17th International Conference on Artificial Intelligence and Statistics, volume 33 of Proceedings of Machine Learning Research, pages 266–274, 2014. [see p. 9]
[HAG16] Huang, W., Absil, P.-A., and Gallivan, K.A. A Riemannian BFGS Method for Nonconvex Optimization Problems, pages 627–634. Springer International Publishing, Cham, 2016. [see p. 78]
[HGA15] Huang, W., Gallivan, K.A., and Absil, P.-A. A Broyden class of quasi-Newton methods for Riemannian optimization. SIAM Journal on Optimization, 25(3):1660–1685, 2015. [see pp. 78, 281, 295]
[Hig08] Higham, N. Functions of Matrices. Society for Industrial and Applied Mathematics, 2008. [see pp. 71 and 75]
[HLV18] Hand, P., Lee, C., and Voroninski, V. ShapeFit: exact location recovery from corrupted pairwise directions. Communications on Pure and Applied Mathematics, 71(1):3–50, 2018. [see p. 5]
[HM96] Helmke, U. and Moore, J.B. Optimization and Dynamical Systems. Springer Science & Business Media, 1996. [see pp. 2 and 77]
[HM21] Hamilton, L. and Moitra, A. A no-go theorem for acceleration in the hyperbolic plane. In Advances in Neural Information Processing Systems (NeurIPS), 2021. [see p. 320]
[HS15] Hosseini, R. and Sra, S. Matrix manifold optimization for Gaussian mixtures. In Cortes, C., Lawrence, N.D., Lee, D.D., Sugiyama, M., and Garnett, R., editors, Advances in Neural Information Processing Systems 28, pages 910–918. Curran Associates, Inc., 2015. [see pp. 14 and 298]
[HS18] Heidel, G. and Schulz, V. A Riemannian trust-region method for low-rank tensor completion. Numerical Linear Algebra with Applications, 25(6):e2175, 2018. [see p. 174]
[HS19] Hosseini, R. and Sra, S. An alternative to EM for Gaussian mixture models: batch and stochastic Riemannian optimization. Mathematical Programming, pages 1–37, 2019. [see p. 298]
[HS20] Hosseini, R. and Sra, S. Recent Advances in Stochastic Riemannian Optimization, pages 527–554. Springer International Publishing, 2020. [see p. 78]
[HU17] Hosseini, S. and Uschmajew, A. A Riemannian gradient sampling algorithm for nonsmooth optimization on manifolds. SIAM Journal on Optimization, 27(1):173–189, 2017. [see p. 78]
[Hua13] Huang, W. Optimization algorithms on Riemannian manifolds with applications. PhD thesis, Florida State University, 2013. [see p. 320]
[HUL01] Hiriart-Urruty, J.-B. and Lemaréchal, C. Fundamentals of Convex Analysis. Grundlehren Text Editions. Springer-Verlag, 1st edition, 2001. [see pp. 298, 299, 320]
[JBAS10] Journée, M., Bach, F., Absil, P.-A., and Sepulchre, R. Low-rank optimization on the cone of positive semidefinite matrices. SIAM Journal on Optimization, 20(5):2327–2351, 2010. [see pp. 15 and 175]
[JD15] Jiang, B. and Dai, Y.-H. A framework of constraint preserving update schemes for optimization on Stiefel manifold. Mathematical Programming, 153(2):535–575, 2015. [see p. 156]
[JMM19] Jawanpuria, P., Meghwanshi, M., and Mishra, B. Low-rank approximations of hyperbolic embeddings. 2019 IEEE 58th Conference on Decision and Control (CDC), December 2019. [see p. 175]
[JNRS10] Journée, M., Nesterov, Y., Richtárik, P., and Sepulchre, R. Generalized power method for sparse principal component analysis. The Journal of Machine Learning Research, 11:517–553, 2010. [see p. 9]
[KGB16] Kovnatsky, A., Glashoff, K., and Bronstein, M.M. MADMM: A Generic Algorithm for Non-smooth Optimization on Manifolds, pages 680–696. Springer International Publishing, Cham, 2016. [see p. 78]
[KMU+20] Khrulkov, V., Mirvakhabova, L., Ustinova, E., Oseledets, I., and Lempitsky, V. Hyperbolic image embeddings. In IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2020. [see p. 175]
[KN63] Kobayashi, S. and Nomizu, K. Foundations of Differential Geometry, Vol. I. John Wiley and Sons, 1963. [see p. 250]
[KNS16] Karimi, H., Nutini, J., and Schmidt, M. Linear convergence of gradient and proximal-gradient methods under the Polyak–Łojasiewicz condition. In Machine Learning and Knowledge Discovery in Databases (ECML PKDD), pages 795–811. Springer International Publishing, 2016. [see p. 320]
[KSM18] Kasai, H., Sato, H., and Mishra, B. Riemannian stochastic recursive gradient algorithm. In Dy, J. and Krause, A., editors, Proceedings of the 35th International Conference on Machine Learning, volume 80 of Proceedings of Machine Learning Research, pages 2516–2524, Stockholmsmässan, Stockholm, Sweden, 10–15 Jul 2018. PMLR. [see pp. 78 and 320]
[KSV14] Kressner, D., Steinlechner, M., and Vandereycken, B. Low-rank tensor completion by Riemannian optimization. BIT Numerical Mathematics, 54(2):447–468, June 2014. [see p. 174]
[Lag07] Lageman, C. Pointwise convergence of gradient-like systems. Mathematische Nachrichten, 280(13–14):1543–1558, October 2007. [see p. 78]
[LB20] Liu, C. and Boumal, N. Simple algorithms for optimization on Riemannian manifolds with constraints. Applied Mathematics and Optimization, 82(3):949–981, 2020. [see pp. 175 and 294]
[LC20] Lezcano-Casado, M. Curvature-dependant global convergence rates for optimization on manifolds of bounded geometry. arXiv preprint arXiv:2008.02517, 2020. [see pp. 294 and 295]
[Lee12] Lee, J.M. Introduction to Smooth Manifolds, volume 218 of Graduate Texts in Mathematics. Springer-Verlag New York, 2nd edition, 2012. [see pp. xiv, 20, 26, 48, 49, 50, 63, 94, 95, 112, 113, 183, 184, 185, 191, 192, 200, 201, 202, 203, 204, 210, 211, 214, 215, 216, 246, 247, 248, 250, 251, 261, 287, 292, 295]
[Lee18] Lee, J.M. Introduction to Riemannian Manifolds, volume 176 of Graduate Texts in Mathematics. Springer, 2nd edition, 2018. [see pp. xiv, 48, 50, 113, 114, 169, 202, 203, 204, 247, 252, 253, 254, 255, 256, 257, 259, 261, 262, 263, 264, 288, 289, 292, 293, 294, 295, 296, 306, 307, 320]
[Lev20] Levin, E. Towards optimization on varieties. Undergraduate senior thesis, Princeton University, 2020. [see p. 174]
[Lic79] Lichnewsky, A. Une méthode de gradient conjugué sur des variétés: application à certains problèmes de valeurs propres non linéaires. Numerical Functional Analysis and Optimization, 1(5):515–560, 1979. [see pp. 2 and 77]
[LKB22] Levin, E., Kileel, J., and Boumal, N. Finding stationary points on bounded-rank matrices: a geometric hurdle and a smooth remedy. Mathematical Programming, 2022. [see pp. 13, 147, 175, 275]
[LLY20] Lai, Z., Lim, L.-H., and Ye, K. Simpler Grassmannian optimization. arXiv preprint arXiv:2009.13502, 2020. [see p. 245]
[Łoj65] Łojasiewicz, S. Ensembles semi-analytiques. Lecture Notes IHES (Bures-sur-Yvette), 1965. [see p. 78]
[LTW22] Li, S., Tang, G., and Wakin, M.B. Landscape correspondence of empirical and population risks in the eigendecomposition problem. IEEE Transactions on Signal Processing, 70:2985–2999, 2022. [see p. 250]
[Lue72] Luenberger, D.G. The gradient projection method along geodesics. Management Science, 18(11):620–631, 1972. [see pp. 2, 77, 174]
[MA20] Massart, E. and Absil, P.-A. Quotient geometry with simple geodesics for the manifold of fixed-rank positive-semidefinite matrices. SIAM Journal on Matrix Analysis and Applications, 41(1):171–198, 2020. [see pp. 175 and 242]
[Man02] Manton, J.H. Optimization algorithms exploiting unitary constraints. IEEE Transactions on Signal Processing, 50(3):635–650, March 2002. [see p. 146]
[Mat96] Mathias, R. A chain rule for matrix functions and applications. SIAM Journal on Matrix Analysis and Applications, 17(3):610–620, 1996. [see p. 71]
[Mey11] Meyer, G. Geometric optimization algorithms for linear regression on fixed-rank matrices. PhD thesis, Université de Liège, Belgique, 2011. [see p. 13]
[Mic08] Michor, P.W. Topics in Differential Geometry, volume 93. American Mathematical Society, 2008. [see p. 250]
[Mis14] Mishra, B. A Riemannian approach to large-scale constrained least-squares with symmetries. PhD thesis, Université de Liège, Belgique, 2014. [see p. 13]
[MMP18] Malagò, L., Montrucchio, L., and Pistone, G. Wasserstein Riemannian geometry of positive definite matrices. arXiv 1801.09269, 2018. [see p. 320]
[Moa03] Moakher, M. Means and averaging in the group of rotations. SIAM Journal on Matrix Analysis and Applications, 24(1):1–16, 2003. [see p. 298]
[Moa05] Moakher, M. A differential geometric approach to the geometric mean of symmetric positive-definite matrices. SIAM Journal on Matrix Analysis and Applications, 26(3):735–747, March 2005. [see pp. 298 and 319]
[MR20] Martínez-Rubio, D. Global Riemannian acceleration in hyperbolic and spherical spaces. arXiv preprint arXiv:2012.03618, 2020. [see p. 320]
[MS85] Machado, A. and Salavessa, I. Grassman manifolds as subsets of Euclidean spaces. In Cordero, L.A., editor, Differential Geometry, Proc. 5th Int. Colloq., volume 131 of Research Notes in Mathematics, pages 85–102. Pitman, 1985. [see p. 245]
[MS16] Mishra, B. and Sepulchre, R. Riemannian preconditioning. SIAM Journal on Optimization, 26(1):635–660, 2016. [see p. 141]
[MS20] Marsland, S. and Sommer, S. Riemannian geometry on shapes and diffeomorphisms: statistics via actions of the diffeomorphism group. In Pennec, X., Sommer, S., and Fletcher, T., editors, Riemannian Geometric Statistics in Medical Image Analysis, pages 135–167. Academic Press, 2020. [see p. 175]
[MT11] McCoy, M. and Tropp, J.A. Two proposals for robust PCA using semidefinite programming. Electronic Journal of Statistics, 5:1123–1160, 2011. [see p. 9]
[MV13] Mishra, B. and Vandereycken, B. A Riemannian approach to low-rank algebraic Riccati equations. arXiv preprint arXiv:1312.4883, 2013. [see p. 13]
[MZL19] Maunu, T., Zhang, T., and Lerman, G. A well-tempered landscape for non-convex robust subspace recovery. Journal of Machine Learning Research, 20(37):1–59, 2019. [see p. 9]
[Nes18] Nesterov, Y. Lectures on Convex Optimization, volume 137 of Optimization and Its Applications. Springer, 2nd edition, 2018. [see p. 14]
[NK17] Nickel, M. and Kiela, D. Poincaré embeddings for learning hierarchical representations. In Guyon, I., Luxburg, U.V., Bengio, S., Wallach, H., Fergus, R., Vishwanathan, S., and Garnett, R., editors, Advances in Neural Information Processing Systems 30, pages 6338–6347. Curran Associates, Inc., 2017. [see p. 175]
[NNSS20] Neumayer, S., Nimmer, M., Setzer, S., and Steidl, G. On the rotational invariant L1-norm PCA. Linear Algebra and Its Applications, 587:243–270, February 2020. [see p. 9]
[NO61] Nomizu, K. and Ozeki, H. The existence of complete Riemannian metrics. Proceedings of the American Mathematical Society, 12(6):889–891, 1961. [see p. 293]
[Nof17] Noferini, V. A formula for the Fréchet derivative of a generalized matrix function. SIAM Journal on Matrix Analysis and Applications, 38(2):434–457, 2017. [see p. 71]
[NP06] Nesterov, Y. and Polyak, B.T. Cubic regularization of Newton method and its global performance. Mathematical Programming, 108(1):177–205, 2006. [see p. 148]
[NSAY+19] Nguyen, V.A., Shafieezadeh-Abadeh, S., Yue, M.-C., Kuhn, D., and Wiesemann, W. Calculating optimistic likelihoods using (geodesically) convex optimization. In Wallach, H., Larochelle, H., Beygelzimer, A., d’Alché-Buc, F., Fox, E., and Garnett, R., editors, Advances in Neural Information Processing Systems, volume 32. Curran Associates, Inc., 2019. [see pp. 298 and 319]
[NW06] Nocedal, J. and Wright, S. Numerical Optimization. Springer Series in Operations Research and Financial Engineering. Springer Science & Business Media, 2nd edition, 2006. [see pp. 61, 117, 140, 147]
[O’N83] O’Neill, B. Semi-Riemannian Geometry: With Applications to Relativity, volume 103. Academic Press, 1983. [see pp. xiv, 48, 50, 113, 114, 170, 203, 225, 247, 294]
[Pat98] Pataki, G. On the rank of extreme matrices in semidefinite programs and the multiplicity of optimal eigenvalues. Mathematics of Operations Research, 23(2):339–358, 1998. [see p. 14]
[Pea94] Pearlmutter, B.A. Fast exact multiplication by the Hessian. Neural Computation, 6:147–160, 1994. [see p. 75]
[Pet76] Peterson, E.L. Geometric programming. SIAM Review, 18(1):1–51, January 1976. [see p. 316]
[Pet06] Petersen, P. Riemannian Geometry, volume 171 of Graduate Texts in Mathematics. Springer, 2nd edition, 2006. [see p. 293]
[PJB18] Pumir, T., Jelassi, S., and Boumal, N. Smoothed analysis of the low-rank approach for smooth semidefinite programs. In Bengio, S., Wallach, H., Larochelle, H., Grauman, K., Cesa-Bianchi, N., and Garnett, R., editors, Advances in Neural Information Processing Systems 31, pages 2283–2292. Curran Associates, Inc., 2018. [see p. 15]
[Pol63] Polyak, B.T. Gradient methods for the minimisation of functionals. USSR Computational Mathematics and Mathematical Physics, 3(4):864–878, January 1963. [see p. 320]
[QGA10a] Qi, C., Gallivan, K.A., and Absil, P.-A. An efficient BFGS algorithm for Riemannian optimization. In Proceedings of the 19th International Symposium on Mathematical Theory of Network and Systems (MTNS 2010), volume 1, pages 2221–2227, 2010. [see pp. 265 and 295]
[QGA10b] Qi, C., Gallivan, K.A., and Absil, P.-A. Riemannian BFGS algorithm with applications. Recent Advances in Optimization and Its Applications in Engineering, pages 183–192, 2010. [see p. 78]
[Qi11] Qi, C.. Numerical optimization methods on Riemannian manifolds. PhD thesis, Florida State University, 2011. [see p. 148]
[QO04] Quiroz, E.A. and Oliveira, P.R.. New results on linear optimization through diagonal metrics and Riemannian geometry tools. Technical report, ES-654/04, PESC COPPE, Federal University of Rio de Janeiro, 2004. [see p. 320]
[Rap91] Rapcsák, T.. Geodesic convexity in nonlinear optimization. Journal of Optimization Theory and Applications, 69(1):169–183, April 1991. [see pp. 305, 319]
[Rap97] Rapcsák, T.. Smooth Nonlinear Optimization in Rⁿ, volume 19 of Nonconvex Optimization and Its Applications. Springer, 1997. [see pp. 2, 77, 298, 303, 304, 316, 319]
[RDTEL21] Rosen, D.M., Doherty, K.J., Terán Espinoza, A., and Leonard, J.J.. Advances in inference and representation for simultaneous localization and mapping. Annual Review of Control, Robotics, and Autonomous Systems, 4:215–242, 2021. [see p. 12]
[Roc70] Rockafellar, R.T.. Convex Analysis. Princeton University Press, 1970. [see pp. 298, 304, 320]
[RW12] Ring, W. and Wirth, B.. Optimization methods on Riemannian manifolds and their application to shape space. SIAM Journal on Optimization, 22(2):596–627, 2012. [see pp. 78, 294]
[Sak96] Sakai, T.. Riemannian Geometry, volume 149. American Mathematical Society, 1996. [see pp. 306, 320]
[Sat16] Sato, H.. A Dai–Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions. Computational Optimization and Applications, 64(1):101–118, 2016. [see p. 78]
[Sat21] Sato, H.. Riemannian Optimization and Its Applications. Springer International Publishing, 2021. [see p. 78]
[Sch66] Schönemann, P.H.. A generalized solution of the orthogonal Procrustes problem. Psychometrika, 31(1):1–10, March 1966. [see p. 156]
[SFF19] Sun, Y., Flammarion, N., and Fazel, M.. Escaping from saddle points on Riemannian manifolds. In Wallach, H., Larochelle, H., Beygelzimer, A., d’Alché-Buc, F., Fox, E., and Garnett, R., editors, Advances in Neural Information Processing Systems 32, pages 7276–7286. Curran Associates, Inc., 2019. [see p. 294]
[SH15] Sra, S. and Hosseini, R.. Conic geometric optimization on the manifold of positive definite matrices. SIAM Journal on Optimization, 25(1):713–739, 2015. [see p. 319]
[SI14] Sato, H. and Iwai, T.. Optimization algorithms on the Grassmann manifold with application to matrix eigenvalue problems. Japan Journal of Industrial and Applied Mathematics, 31(2):355–400, April 2014. [see pp. 245, 250]
[SI15] Sato, H. and Iwai, T.. A new, globally convergent Riemannian conjugate gradient method. Optimization, 64(4):1011–1031, 2015. [see p. 78]
[SKM19] Sato, H., Kasai, H., and Mishra, B.. Riemannian stochastic variance reduced gradient algorithm with retraction and vector transport. SIAM Journal on Optimization, 29(2):1444–1472, 2019. [see p. 78]
[Smi94] Smith, S.T.. Optimization techniques on Riemannian manifolds. Fields Institute Communications, 3(3):113–135, 1994. [see pp. 2, 77, 78]
[SN22] Sun, S. and Nocedal, J.. A trust region method for the optimization of noisy functions. arXiv:2201.00973, 2022. [see p. 139]
[SQW17] Sun, J., Qu, Q., and Wright, J.. Complete dictionary recovery over the sphere II: Recovery by Riemannian trust-region method. IEEE Transactions on Information Theory, 63(2):885–914, February 2017. [see p. 8]
[Sra16] Sra, S.. On the matrix square root via geometric optimization. Electronic Journal of Linear Algebra, 31(1):433–443, July 2016. [see p. 298]
[Ste83] Steihaug, T.. The conjugate gradient method and trust regions in large scale optimization. SIAM Journal on Numerical Analysis, 20(3):626–637, 1983. [see p. 147]
[TA21] Tang, T.M. and Allen, G.I.. Integrated principal components analysis. Journal of Machine Learning Research, 22(198):1–71, 2021. [see p. 298]
[Tak11] Takatsu, A.. Wasserstein geometry of Gaussian measures. Osaka Journal of Mathematics, 48(4):1005–1026, 2011. [see p. 320]
[TB97] Trefethen, L.N. and Bau, D.. Numerical Linear Algebra. Society for Industrial and Applied Mathematics, 1997. [see pp. xii, 121, 125, 146]
[TD14] Tron, R. and Daniilidis, K.. On the quotient representation for the essential manifold. In 2014 IEEE Conference on Computer Vision and Pattern Recognition. IEEE, June 2014. [see p. 175]
[TFBJ18] Tripuraneni, N., Flammarion, N., Bach, F., and Jordan, M.I.. Averaging stochastic gradient descent on Riemannian manifolds. In Proceedings of The 31st Conference on Learning Theory, COLT, 2018. [see p. 320]
[TG21] Trendafilov, N. and Gallo, M.. Multivariate Data Analysis on Matrix Manifolds (with Manopt). Springer International Publishing, 2021. [see p. 11]
[TKW16] Townsend, J., Koep, N., and Weichwald, S.. Pymanopt: a Python toolbox for optimization on manifolds using automatic differentiation. Journal of Machine Learning Research, 17(137):1–5, 2016. [see p. 149]
[Toi81] Toint, P.. Towards an efficient sparsity exploiting Newton method for minimization. In Duff, I.S., editor, Sparse Matrices and Their Uses, pages 57–88. Academic Press, 1981. [see p. 147]
[Udr94] Udrişte, C.. Convex Functions and Optimization Methods on Riemannian Manifolds, volume 297 of Mathematics and Its Applications. Kluwer Academic Publishers, 1994. [see pp. 2, 77, 298, 304, 306, 319]
[UV13] Uschmajew, A. and Vandereycken, B.. The geometry of algorithms using hierarchical tensors. Linear Algebra and Its Applications, 439(1):133–166, July 2013. [see p. 174]
[UV20] Uschmajew, A. and Vandereycken, B.. Geometric methods on low-rank matrix and tensor manifolds. In Handbook of Variational Methods for Nonlinear Geometric Data, pages 261–313. Springer International Publishing, 2020. [see p. 174]
[Van10] Vandereycken, B.. Riemannian and multilevel optimization for rank-constrained matrix problems. PhD thesis, Faculty of Engineering, Katholieke Universiteit Leuven, 2010. [see p. 13]
[Van13] Vandereycken, B.. Low-rank matrix completion by Riemannian optimization. SIAM Journal on Optimization, 23(2):1214–1236, 2013. [see pp. 13, 174]
[Vav91] Vavasis, S.A.. Nonlinear Optimization: Complexity Issues. Oxford University Press, Inc., 1991. [see p. 141]
[VAV09] Vandereycken, B., Absil, P.-A., and Vandewalle, S.. Embedded geometry of the set of symmetric positive semidefinite matrices of fixed rank. In SSP09, pages 389–392. IEEE, 2009. [see pp. 175, 242]
[Vis18] Vishnoi, N.K.. Geodesic convex optimization: differentiation on manifolds, geodesics, and convexity. arXiv:1806.06373, 2018. [see pp. 298, 309, 318]
[Wie12] Wiesel, A.. Geodesic convexity and covariance estimation. IEEE Transactions on Signal Processing, 60(12):6182–6189, December 2012. [see p. 298]
[WW20] Waldspurger, I. and Waters, A.. Rank optimality for the Burer–Monteiro factorization. SIAM Journal on Optimization, 30(3):2577–2602, 2020. [see p. 15]
[WY13] Wen, Z. and Yin, W.. A feasible method for optimization with orthogonality constraints. Mathematical Programming, 142(1–2):397–434, 2013. [see p. 156]
[YZS14] Yang, W.H., Zhang, L.-H., and Song, R.. Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization, 10(2):415–434, 2014. [see pp. 78, 146]
[ZHS16] Zadeh, P.H., Hosseini, R., and Sra, S.. Geometric mean metric learning. In Proceedings of the 33rd International Conference on International Conference on Machine Learning, ICML, pages 2464–2471. JMLR.org, 2016. [see p. 298]
[ZRS16] Zhang, H., Reddi, S.J., and Sra, S.. Riemannian SVRG: Fast stochastic optimization on Riemannian manifolds. In Lee, D.D., Sugiyama, M., Luxburg, U.V., Guyon, I., and Garnett, R., editors, Advances in Neural Information Processing Systems 29, pages 4592–4600. Curran Associates, Inc., 2016. [see p. 78]
[ZS16] Zhang, H. and Sra, S.. First-order methods for geodesically convex optimization. In Conference on Learning Theory, pages 1617–1638. PMLR, 2016. [see pp. 78, 298, 319, 320]
[ZS18] Zhang, H. and Sra, S.. An estimate sequence for geodesically convex optimization. In Bubeck, S., Perchet, V., and Rigollet, P., editors, Proceedings of the 31st Conference On Learning Theory, volume 75 of Proceedings of Machine Learning Research, pages 1703–1723. PMLR, July 6–9, 2018. [see p. 320]
[ZZ18] Zhang, J. and Zhang, S.. A cubic regularized Newton’s method over Riemannian manifolds. arXiv:1805.05565, 2018. [see p. 148]

  • References
  • Nicolas Boumal, École Polytechnique Fédérale de Lausanne
  • Book: An Introduction to Optimization on Smooth Manifolds
  • Online publication: 09 March 2023