I have been an assistant professor in the Computer Science Department at the University of Iowa since August 2014. Before joining the University of Iowa, I was a researcher at NEC Laboratories America, Inc.; before that, I was a Machine Learning Researcher at GE Global Research. I received my Ph.D. in Computer Science from Michigan State University in 2012.
Here are my Google Scholar Citations and CV.
- Mingrui Liu (PhD student, joined the group in Fall 2016)
- Xin Man (PhD student, joined the group in Fall 2016)
- Dixian Zhu (PhD student, joined the group in 2018)
- Zhuoning Yuan (PhD student, joined the group in 2018)
- Qi Qi (PhD student, joined the group in 2018)
- Zhishuai Guo (PhD student, joined the group in 2018)
- Yan Yan (Visiting student: 02/2016 - 07/2016; Postdoctoral Research Associate: 09/2018 - present)
- Zaiyi Chen (Visiting student, 09/2016 - 09/2017)
- Yi Xu (PhD 2019), Accelerate Convex Optimization in Machine Learning by Leveraging Structural Conditions. First Appointment: Alibaba Group DAMO Academy
- Zhe Li (PhD 2018), Optimizing Neural Network Structures: Faster Speed, Smaller Size, Less Tuning. First Appointment: Apple Inc.
- Xiaoxuan Zhang (PhD 2018), Online Learning for Imbalanced Data: Optimizing Asymmetric Measures. First Appointment: Ancestry Inc.
I am interested in machine learning and optimization and their applications to big data analytics.
My current research topics include:
- Non-Convex Optimization Algorithms
To develop provable and practical optimization algorithms for solving non-convex problems in machine learning (a sketch of the momentum-style updates involved appears after this list).
[Unified Momentum], [NEON], [Adaptive NCD], [Stagewise Optimization for DL], [Stagewise Katyusha], [Non-Convex Concave Min-Max], [Non-Convex Non-Concave Min-Max]
- Faster Convergent Algorithms by Leveraging Error Bound Condition
To develop faster-converging optimization algorithms by leveraging local and global error bound conditions (the condition is stated after this list).
[RSG], [HOPS], [ASSG], [Adaptive ADMM], [adaSVRG], [adaAGC], [SadaGrad], [LogProj], [ada-Frank-Wolfe]
- Large-scale Stochastic Optimization
To develop efficient algorithms for stochastic optimization for learning from large-scale, high-dimensional data.
- Online Optimization
To develop new online convex optimization algorithms (a minimal example follows this list).
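As a flavor of the non-convex algorithms in the first topic above, here is a rough sketch of a stochastic unified momentum update (my paraphrase for illustration; see the [Unified Momentum] paper for the precise algorithm and its analysis). With step size $\alpha$, momentum parameter $\beta$, interpolation parameter $s \ge 0$, and stochastic gradient $g_k$ at iterate $x_k$:

$$
\begin{aligned}
y_{k+1} &= x_k - \alpha\, g_k,\\
y^{s}_{k+1} &= x_k - s\,\alpha\, g_k,\\
x_{k+1} &= y_{k+1} + \beta\,\bigl(y^{s}_{k+1} - y^{s}_{k}\bigr),
\end{aligned}
$$

so that $s = 0$ recovers the stochastic heavy-ball method and $s = 1$ recovers stochastic Nesterov momentum.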
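The error bound condition in the second topic can be stated roughly as follows (one standard local form; the exact assumptions vary across the linked papers). With $F_*$ the optimal value of $\min_x F(x)$ and $X_*$ the optimal set, a local error bound with exponent $\theta \in (0, 1]$ and constant $c > 0$ requires

$$
\mathrm{dist}(x, X_*) \;\le\; c\,\bigl(F(x) - F_*\bigr)^{\theta}
\qquad \text{for all } x \text{ with } F(x) \le F_* + \epsilon.
$$

Restarting a (sub)gradient method each time the objective gap is halved exploits this condition to improve on the generic $O(1/\epsilon^2)$ complexity of subgradient descent.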
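Finally, as a minimal example of the online convex optimization setting in the last topic, the Python sketch below implements plain online projected gradient descent with the standard $O(1/\sqrt{t})$ step size. It is purely illustrative: the function names and toy data are hypothetical, and none of the linked papers prescribe this exact routine.

```python
import numpy as np

def online_gradient_descent(grads, project, x0, eta0=0.1):
    """Plain online projected gradient descent (illustrative sketch).

    grads:   sequence of gradient oracles; grads[t](x) returns the
             (sub)gradient of the t-th loss f_t at x.
    project: Euclidean projection onto the feasible convex set.
    """
    x = x0
    iterates = [x0]
    for t, grad in enumerate(grads, start=1):
        eta = eta0 / np.sqrt(t)          # standard decaying step size
        x = project(x - eta * grad(x))   # gradient step, then projection
        iterates.append(x)
    return iterates

# Toy usage: online least squares over the unit L2 ball (hypothetical data).
rng = np.random.default_rng(0)
data = [(rng.normal(size=3), rng.normal()) for _ in range(100)]
grads = [lambda x, a=a, b=b: 2 * (a @ x - b) * a for a, b in data]
project = lambda x: x / max(1.0, np.linalg.norm(x))
xs = online_gradient_descent(grads, project, x0=np.zeros(3))
```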
Some old research topics during my PhD studies include:
- Deep Learning
To understand, design, and optimize deep neural networks for large-scale image classification. See the demo on flower, food, and vehicle recognition, developed in collaboration with NEC Labs America.
- Randomized Algorithms for Big Data Analytics
Developed randomized algorithms for solving big data machine learning problems. See here.
- Distributed Optimization for Big Data
Developed a practical distributed optimization framework for solving big data classification and regression problems. See Software.
- Social Network Analysis
Developed algorithms for detecting communities and their dynamic evolution in social networks.
- Learning from Noisy Labels
Developed algorithms and theory for learning from noisy labels.
- Multiple Kernel Learning
Developed efficient and robust algorithms for multiple kernel learning.
- Improved Bounds for the Nystrom Method
Developed several improved bounds for the Nystrom method under a large eigengap condition and a power-law eigenvalue distribution, together with the theory of their applications to large-scale kernel learning.
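For readers unfamiliar with the Nystrom method, the sketch below shows the plain uniform-sampling variant these bounds concern: sample $m$ landmark columns of a PSD kernel matrix $K$ and form the rank-$m$ approximation $\hat K = C W^{+} C^{\top}$. The code is a minimal illustration with hypothetical names and toy data; the papers analyze its error rather than prescribe an implementation.

```python
import numpy as np

def nystrom_approximation(K, m, seed=None):
    """Plain Nystrom approximation of a PSD kernel matrix (sketch).

    Samples m landmark columns uniformly without replacement and
    returns K_hat = C @ pinv(W) @ C.T, where C = K[:, idx] and
    W is the m x m landmark-landmark block of K.
    """
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    idx = rng.choice(n, size=m, replace=False)  # uniform landmark sampling
    C = K[:, idx]                               # n x m sampled columns
    W = K[np.ix_(idx, idx)]                     # m x m intersection block
    return C @ np.linalg.pinv(W) @ C.T          # rank-<=m approximation

# Toy usage: RBF kernel on random points (hypothetical data).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-sq_dists / 20.0)
K_hat = nystrom_approximation(K, m=50, seed=1)
print(np.linalg.norm(K - K_hat) / np.linalg.norm(K))  # relative error
```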