Peter Richtarik
Peter Richtarik is a Slovak mathematician and computer scientist[1] working in the area of big data optimization and machine learning, known for his work on randomized coordinate descent algorithms, stochastic gradient descent and federated learning. He is currently a Professor of Computer Science at the King Abdullah University of Science and Technology.
| Peter Richtarik | |
|---|---|
| Born | |
| Nationality | Slovak |
| Alma mater | Comenius University; Cornell University |
| Scientific career | |
| Fields | Mathematics, Computer Science, Machine Learning |
| Institutions | KAUST |
| Thesis | Some algorithms for large-scale convex and linear minimization in relative scale (2007) |
| Academic advisors | Yurii Nesterov |
| Website | https://richtarik.org |
Education
Richtarik earned a master's degree in mathematics from Comenius University, Slovakia, in 2001, graduating summa cum laude.[2] In 2007, he obtained a PhD in operations research from Cornell University, advised by Michael Jeremy Todd.[3][4]
Career
From 2007 to 2009, he was a postdoctoral scholar at the Center for Operations Research and Econometrics and the Department of Mathematical Engineering at the Université catholique de Louvain, Belgium, working with Yurii Nesterov.[5][6] From 2009 to 2019, Richtarik was a Lecturer and later a Reader in the School of Mathematics at the University of Edinburgh. He is a Turing Fellow.[7] Richtarik founded and organizes a conference series entitled "Optimization and Big Data".[8][9]
Academic work
Richtarik's early research concerned gradient-type methods, optimization in relative scale, sparse principal component analysis and algorithms for optimal design. Since his appointment at Edinburgh, he has worked extensively on the algorithmic foundations of randomized methods in convex optimization, especially randomized coordinate descent algorithms and stochastic gradient descent methods. These methods are well suited to optimization problems involving big data and have applications in fields such as machine learning, signal processing and data science.[10][11] Richtarik is a co-inventor of an algorithm generalizing the randomized Kaczmarz method for solving systems of linear equations, contributed to the invention of federated learning, and co-developed a stochastic variant of Newton's method.
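To give a sense of the family of methods mentioned above, the following is a minimal sketch of the classical randomized Kaczmarz method for a consistent linear system Ax = b, which the generalized framework referenced here extends. The function name, sampling scheme (rows chosen with probability proportional to their squared norm), and toy system are illustrative assumptions, not taken from any specific paper:

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=2000, seed=0):
    """Iteratively solve a consistent system Ax = b by projecting the
    current iterate onto the hyperplane of one randomly chosen row."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.sum(A**2, axis=1)
    probs = row_norms / row_norms.sum()  # sample rows by squared norm
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        # Orthogonal projection of x onto {y : A[i] @ y = b[i]}
        x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Toy consistent 3x2 system with known solution
A = np.array([[3.0, 1.0], [1.0, 2.0], [2.0, 5.0]])
x_true = np.array([1.0, -2.0])
b = A @ x_true
x = randomized_kaczmarz(A, b)
```

Each iteration touches only one row of A, which is why methods of this type scale to problems where the full matrix is too large to process at once; randomized coordinate descent applies the analogous idea to columns (coordinates) of the variable.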
Awards and distinctions
- 2020, With a Hirsch index of 40 or more,[12] he ranks among the top 0.05% of computer scientists.[13]
- 2016, SIGEST Award (jointly with Olivier Fercoq)[14] of the Society for Industrial and Applied Mathematics
- 2016, EPSRC Early Career Fellowship in Mathematical Sciences[15]
- 2015, EUSA Best Research or Dissertation Supervisor Award (2nd place)[16]
- 2014, Plenary Talk at 46th Conference of Slovak Mathematicians[17]
Bibliography
- Peter Richtarik & Martin Takac (2012). "Efficient serial and parallel coordinate descent methods for huge-scale truss topology design". Operations Research Proceedings 2011. Operations Research Proceedings. Springer-Verlag. pp. 27–32. doi:10.1007/978-3-642-29210-1_5. ISBN 978-3-642-29209-5.
- Peter Richtarik & Martin Takac (2014). "Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function". Mathematical Programming. Springer. 144 (1): 1–38. arXiv:1107.2848. doi:10.1007/s10107-012-0614-z. S2CID 254137101.
- Olivier Fercoq & Peter Richtarik (2015). "Accelerated, parallel and proximal coordinate descent". SIAM Journal on Optimization. 25 (4): 1997–2023. arXiv:1312.5799. doi:10.1137/130949993. S2CID 8068556.
- Dominik Csiba; Zheng Qu; Peter Richtarik (2015). "Stochastic Dual Coordinate Ascent with Adaptive Probabilities" (PDF). Proceedings of the 32nd International Conference on Machine Learning. pp. 674–683.
- Robert M Gower & Peter Richtarik (2015). "Randomized Iterative Methods for Linear Systems". SIAM Journal on Matrix Analysis and Applications. 36 (4): 1660–1690. doi:10.1137/15M1025487. hdl:20.500.11820/5c673b9e-8cf3-482c-8602-da8abcb903dd. S2CID 8215294.
- Peter Richtarik & Martin Takac (2016). "Parallel coordinate descent methods for big data optimization". Mathematical Programming. 156 (1): 433–484. doi:10.1007/s10107-015-0901-6. hdl:20.500.11820/a5649cad-b6b8-4ccc-9ca2-b368131dcbe5. S2CID 254133277.
- Zheng Qu & Peter Richtarik (2016). "Coordinate descent with arbitrary sampling I: algorithms and complexity". Optimization Methods and Software. 31 (5): 829–857. arXiv:1412.8060. doi:10.1080/10556788.2016.1190360. S2CID 2636844.
- Zheng Qu & Peter Richtarik (2016). "Coordinate descent with arbitrary sampling II: expected separable overapproximation". Optimization Methods and Software. 31 (5): 858–884. arXiv:1412.8063. doi:10.1080/10556788.2016.1190361. S2CID 11048560.
- Zheng Qu; Peter Richtarik; Martin Takac; Olivier Fercoq (2016). "SDNA: Stochastic Dual Newton Ascent for Empirical Risk Minimization" (PDF). Proceedings of the 33rd International Conference on Machine Learning. pp. 1823–1832.
- Zeyuan Allen-Zhu; Zheng Qu; Peter Richtarik; Yang Yuan (2016). "Even faster accelerated coordinate descent using non-uniform sampling" (PDF). Proceedings of the 33rd International Conference on Machine Learning. pp. 1110–1119.
- Dominik Csiba & Peter Richtarik (2016). "Importance sampling for minibatches". arXiv:1602.02283 [cs.LG].
- Dominik Csiba & Peter Richtarik (2016). "Coordinate descent face-off: primal or dual?". arXiv:1605.08982 [math.OC].
References
- "Richtarik's DBLP profile". Retrieved December 23, 2020.
- "Richtarik's CV" (PDF). Retrieved August 21, 2016.
- "Mathematics Genealogy Project". Retrieved August 20, 2016.
- "Cornell PhD Thesis". Retrieved August 22, 2016.
- "Postdoctoral Fellows at CORE". Retrieved August 22, 2016.
- "Simons Institute for the Theory of Computing, UC Berkeley". Retrieved August 22, 2016.
- "Alan Turing Institute Faculty Fellows". Retrieved August 22, 2016.
- "Optimization and Big Data 2012". Retrieved August 20, 2016.
- "Optimization and Big Data 2015". Retrieved August 20, 2016.
- Cathy O'Neil & Rachel Schutt (2013). "Modeling and Algorithms at Scale". Doing Data Science: Straight Talk from the Frontline. O'Reilly. ISBN 9781449358655. Retrieved August 21, 2016.
- Sebastien Bubeck (2015). Convex Optimization: Algorithms and Complexity. Foundations and Trends in Machine Learning. Now Publishers. ISBN 978-1601988607.
- "Google Scholar". Retrieved December 28, 2020.
- "The h Index for Computer Science". Retrieved December 28, 2020.
- "SIGEST Award". Retrieved August 20, 2016.
- "EPSRC Fellowship". Retrieved August 21, 2016.
- "EUSA Awards 2015". Retrieved August 20, 2016.
- "46th Conference of Slovak Mathematicians". Retrieved August 22, 2016.