Ph.D. Candidate, University of Wisconsin-Madison
I am a fifth-year Ph.D. candidate in the Department of Computer Sciences at the University of Wisconsin-Madison, where I am fortunate to be advised by Prof. Jelena Diakonikolas. I am primarily interested in large-scale optimization methods and their applications to machine learning. My current research aims to design novel optimization algorithms with solid theoretical guarantees and good practical performance.
Before moving to Madison, I completed my M.S. degree in Computer Science at Boston University in 2019, where I had the great opportunity to work on convex optimization with Prof. Lorenzo Orecchia and Jelena. Prior to that, I obtained my B.A. degree in Mathematics at the University of Cambridge in 2015, supervised by Prof. Christopher Tout and Dr. Paul Russell.
Sep 2024: Shuffled SGD is accepted to NeurIPS’24!
Apr 2023: ACODER is accepted to ICML’23!
Sep 2022: CLVR is accepted to NeurIPS’22!
May 2022: I presented our work (CLVR) during 2022 Optimization Days at HEC Montréal.
May 2021: Parameter-free Locally Accelerated Conditional Gradients is accepted to ICML’21!
Applied Scientist Intern | Machine Intelligence and Decision Analytics for Search, Amazon
May 2021 - Aug 2021
Berkeley, CA, United States
Worked on a deep learning research project to improve the quality of Amazon's search query suggestions.
Software Development Engineer Intern | Machine Intelligence and Decision Analytics for Search, Amazon
Mar 2019 - Jan 2020
Berkeley, CA, United States
Worked on machine learning infrastructure for understanding and suggesting customer search queries.
(* denotes equal contribution, α-β denotes alphabetical ordering)
X. Cai*, C-Y. Lin*, J. Diakonikolas, “Empirical Risk Minimization with Shuffled SGD: A Primal-Dual Perspective and Improved Bounds,” accepted to NeurIPS’24, 2024. [arXiv]
C-Y. Lin, C. Song, J. Diakonikolas, “Accelerated Cyclic Coordinate Dual Averaging with Extrapolation for Composite Convex Optimization,” in Proc. ICML’23, 2023. [arXiv]
C. Song*, C-Y. Lin*, S. Wright, J. Diakonikolas, “Coordinate Linear Variance Reduction for Generalized Linear Programming,” in Proc. NeurIPS’22, 2022. [arXiv] [Talk] [Poster]
(α-β) A. Carderera, J. Diakonikolas, C. Y. Lin, S. Pokutta, “Parameter-free Locally Accelerated Conditional Gradients,” in Proc. ICML’21, 2021. [arXiv] [Talk] [Slides]
2023 Spring
2022 Spring
2020 Spring
2022 Spring
CS 839 Theoretical Foundations of Deep Learning (Prof. Yingyu Liang)
CS 787 Advanced Algorithms (Prof. Christos Tzamos)
2021 Fall
Math 888 High-Dimensional Probability and Statistics (Prof. Sébastien Roch)
2021 Spring
CS 525 Linear Optimization (Prof. Alberto Del Pia)
CS 728 Integer Optimization (Prof. Jim Luedtke)
2020 Fall
CS 524 Introduction to Optimization (Prof. Michael Ferris)
CS 727 Convex Analysis (Prof. Stephen M. Robinson)
2020 Spring
CS 726 Nonlinear Optimization (Prof. Jelena Diakonikolas)
CS 761 Mathematical Foundations of Machine Learning (Prof. Robert Nowak & Prof. Kangwook Lee)