Yangyang Xu

Assistant Professor, Mathematical Sciences

Dr. Yangyang Xu earned a bachelor's degree in Computational Mathematics from Nanjing University, a master's degree from the Institute of Applied Mathematics at the Chinese Academy of Sciences, and his Ph.D. from the Department of Computational and Applied Mathematics at Rice University in 2014. Before joining RPI, Dr. Xu was an assistant professor at the University of Alabama. He also spent one year as a postdoctoral fellow at the University of Waterloo and another year as an NSF postdoctoral fellow at the University of Minnesota.

Dr. Xu's broad research interests lie in optimization theory and methods and their applications in machine learning, statistics, and signal processing. He has worked on developing algorithms for compressed sensing, matrix completion, and tensor factorization and learning. Recently, his research has focused on first-order methods, operator splitting, stochastic optimization methods, and high-performance parallel computing. This work is motivated by large-scale problems arising in machine learning and image processing.


Ph.D., Computational & Applied Mathematics
Rice University, 2014

M.S., Operations Research
Chinese Academy of Sciences, 2010

B.S., Computational Mathematics
Nanjing University, 2007


Research Focus
  • Optimization
  • Parallel computing
  • Compressed sensing, matrix completion, tensor factorization and learning
Select Works
  • Y. Ouyang and Y. Xu. Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems. Mathematical Programming, Series A, 185, pp. 1-35, 2021.
  • Y. Xu. Primal-dual stochastic gradient method for convex programs with many functional constraints. SIAM Journal on Optimization, 30(2), pp. 1664-1692, 2020.
  • Y. Xu. Hybrid Jacobian and Gauss-Seidel proximal block coordinate update methods for linearly constrained convex programming. SIAM Journal on Optimization, 28(1), pp. 646-670, 2018.
  • Y. Xu. Accelerated first-order primal-dual proximal methods for linearly constrained composite convex programming. SIAM Journal on Optimization, 27(3), pp. 1459-1484, 2017.
  • Z. Peng, Y. Xu, M. Yan and W. Yin. ARock: an algorithmic framework for asynchronous parallel coordinate updates. SIAM Journal on Scientific Computing, 38(5), pp. A2851-A2879, 2016.
  • N. Zhou, Y. Xu, H. Cheng, J. Fang and W. Pedrycz. Global and local structure preserving sparse subspace learning: an iterative approach to unsupervised feature selection. Pattern Recognition, 53, pp. 87-101, 2016.
  • Y. Xu and W. Yin. Block stochastic gradient iteration for convex and nonconvex optimization. SIAM Journal on Optimization, 25(3), pp. 1686-1716, 2015.
  • Y. Xu and W. Yin. A block coordinate descent method for regularized multi-convex optimization with applications to nonnegative tensor factorization and completion. SIAM Journal on Imaging Sciences, 6(3), pp. 1758-1789, 2013.