Publications

  • G. M. Chari, U. Jang, E. K. Ryu, and B. Açıkmeşe. Optimal Acceleration for Proximal Minimization of the Sum of Convex and Strongly Convex Functions. Preprint, 2026. [Link]

  • U. Jang, K. Sun, W. Yin, and E. K. Ryu. ALiA: Adaptive Linearized ADMM. Preprint, 2026. [Link]

  • U. Jang and E. K. Ryu. Point convergence of Nesterov’s accelerated gradient method: An AI-assisted proof. Preprint, 2025. [Link]

  • C. Park, U. Jang, E. K. Ryu, and I. Yang. Sharpness-Aware Minimization Can Hallucinate Minimizers. International Conference on Machine Learning, 2026. [Link]

  • U. Jang, S. Das Gupta, and E. K. Ryu. Computer-Assisted Design of Accelerated Composite Optimization Methods: OptISTA. Mathematical Programming, 2025. [Link]

  • U. Jang, J. D. Lee, and E. K. Ryu. LoRA Training in the NTK Regime has No Spurious Local Minima. International Conference on Machine Learning (Oral, top 144/9473=1.5% of papers), 2024. [Link]

  • U. Jang (published as E. Chang), J. Kim, H. Kwak, H. H. Lee, and S. G. Youn. Irreducibly SU(2)-covariant quantum channels of low rank. Reviews in Mathematical Physics, 2022. [Link]