Hello and welcome! I am a final-year PhD student in Electrical Engineering at Stanford, advised by Emmanuel Candès, with research interests in optimization, signal processing, statistics, and decision making in social-dynamic environments. Earlier, I spent four wonderful years at Rice University, graduating in 2015 with two bachelor's degrees (in Electrical Engineering and Statistics) and a minor in Applied Mathematics, followed by a lovely summer at EPFL before grad school.
Publications & Preprints
- Subgradient Descent Learns Orthogonal Dictionaries, ICLR '19. with Yu Bai, Ju Sun
- Near-Optimal Method For Highly Smooth Convex Optimization, COLT '19. with Sébastien Bubeck, Yin Tat Lee, Yuanzhi Li, Aaron Sidford
- Complexity of Highly Parallel Non-Smooth Convex Optimization, NeurIPS '19 (Spotlight Presentation). with Sébastien Bubeck, Yin Tat Lee, Yuanzhi Li, Aaron Sidford
- Optimizing Black-box Metrics with Adaptive Surrogates, ICML '20 (work done while an intern at Google Research). with Oliver Adigun, Hari Narasimhan, Mahdi Milani Fard, Maya Gupta
- Acceleration with a Ball Optimization Oracle, NeurIPS '20 (Oral Presentation). with Yair Carmon, Arun Jambulapati, Yujia Jin, Yin Tat Lee, Aaron Sidford, Kevin Tian
- Learning the Truth from Only One Side of the Story, AISTATS '21. with Heinrich Jiang, Aldo Pacchiano
- Randomized Alternating Direction Methods for Efficient Distributed Optimization. with Emmanuel Candès, Mert Pilanci
- Mirror Langevin Monte Carlo: the Case Under Isoperimetry.
Teaching & Mentorship
I have had fun TAing for the High-dimensional Statistics, Convex Optimization, and Machine Learning classes in the past, and serving as a technical mentor for the Data Science for Social Good program at Stanford. I quite enjoy applying statistical and algorithmic tools to social science questions.
Perhaps I may entertain you with Compressed Sensing for People in a Hurry?
Feel free to email me at qjiang2 AT stanford DOT edu or stop by 380-U. Outside of research, you will most likely find me lost in books, art, or nature.