Name: Ding Lizhong
Discipline: Computer Science and Technology
Title: Professor
Contact number:
E-mail: 6120220231@bit.edu.cn
Address:
Personal Information
Ding Lizhong is a professor and doctoral supervisor at Beijing Institute of Technology and a recipient of a national-level young talent program. His research interests include statistical learning methods and theory, statistical hypothesis testing and deep generative models, kernel methods and deep kernel methods, and stochastic algorithms and matrix approximation. He has published more than 20 papers in top international conferences and journals such as NeurIPS, AAAI, TPAMI, and TNNLS, and has served as a reviewer for top conferences including NeurIPS, ICML, ICLR, IJCAI, and AAAI. In recent years, his research has focused on two directions: first, moving beyond the existing representation-learning view of deep learning to establish a complete and rigorous foundation for deep learning based on classical statistical learning theory; second, developing deep generative models and deep kernel discriminative models capable of analyzing and processing complex, high-dimensional, large-scale data. These results are intended to address unresolved frontier problems in the national digital society and digital livelihood.
Each year, the group plans to recruit 1 doctoral student and 2 to 3 master's students, and excellent senior undergraduates are also welcome to join the laboratory. Outstanding students will be recommended for further study in Saudi Arabia, the UAE, and Hong Kong. If you are interested in joining the team, please send your resume to 6120220231@bit.edu.cn.
Research Direction
Stochastic machine learning theory and methods, statistical learning methods and theory, statistical hypothesis testing and deep generative models, kernel methods and deep kernel methods
Representative Academic Achievements
1. Lizhong Ding, Shizhong Liao, Yong Liu, Li Liu, Fan Zhu, Yazhou Yao, Ling Shao, Xin Gao. Approximate Kernel Selection via Matrix Approximation, IEEE Transactions on Neural Networks and Learning Systems (TNNLS), 2020, 31(11): 4881-4891.
2. Lizhong Ding, Mengyang Yu, Li Liu, Fan Zhu, Yong Liu, Yu Li, Ling Shao. Two Generator Game: Learning to Sample via Linear Goodness-of-Fit Test, Proceedings of the 33rd Annual Conference on Neural Information Processing Systems (NeurIPS), 2019.
3. Yong Liu, Shizhong Liao, Shali Jiang, Lizhong Ding, Hailun Lin, and Weiping Wang. IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI), 2019, 42(53):1083-1096.
4. Lizhong Ding, Zhi Liu, Yu Li, Shizhong Liao, Yong Liu, Peng Yang, Ge Yu, Ling Shao, Xin Gao. Linear Kernel Tests via Empirical Likelihood for High-Dimensional Data, Proceedings of the 33rd AAAI Conference on Artificial Intelligence (AAAI), 2019.
5. Lizhong Ding, Yong Liu, Shizhong Liao, Yu Li, Peng Yang, Yijie Pan, Chao Huang, Ling Shao, and Xin Gao. Proceedings of the 33rd AAAI Conference on Artificial Intelligence (AAAI), pp. 3462-3469, 2019.
6. Yazhou Yao, Zeren Sun, Fumin Shen, Li Liu, Limin Wang, Fan Zhu, Lizhong Ding, Gangshan Wu, and Ling Shao. Proceedings of the 28th International Joint Conference on Artificial Intelligence (IJCAI), pp. 996-1002, 2019.
7. Yu Li, Chao Huang, Lizhong Ding, Zhongxiao Li, Yijie Pan, and Xin Gao. Methods, vol. 166, pp. 4-21, 2019.
8. Lizhong Ding, Shizhong Liao, Yong Liu, Peng Yang, Xin Gao. Randomized Kernel Selection with Spectra of Multilevel Circulant Matrices, Proceedings of the 32nd AAAI Conference on Artificial Intelligence (AAAI), 2018.
9. Jian Li, Yong Liu, Rong Yin, Hua Zhang, Lizhong Ding, and Weiping Wang. Advances in Neural Information Processing Systems (NeurIPS), pp. 1591-1600, 2018.
10. Yong Liu, Hailun Lin, Lizhong Ding, Weiping Wang. Proceedings of the 27th International Joint Conference on Artificial Intelligence (IJCAI), pp. 2497-2503, 2018.
11. Lizhong Ding, Shizhong Liao. An Approximate Approach to Automatic Kernel Selection, IEEE Transactions on Cybernetics (TCYB), 2017, 47(3): 554-565.
12. Lizhong Ding, Shizhong Liao. Approximate Consistency: Towards Foundations of Approximate Kernel Selection, Proceedings of the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD), pp. 354-369, 2014.
13. Lizhong Ding, Shizhong Liao. Model Selection with the Covering Number of the Ball of RKHS, Proceedings of the 23rd ACM International Conference on Information and Knowledge Management (CIKM), pp. 1159-1168, 2014.
Awards