Khai Nguyen
Ph.D. Candidate in the Department of Statistics and Data Sciences, University of Texas at Austin
Hi! I’m Khai, a fourth-year Ph.D. candidate in the Department of Statistics and Data Sciences at the University of Texas at Austin. I am fortunate to be advised by Professor Nhat Ho and Professor Peter Müller, and to be associated with the Institute for Foundations of Machine Learning (IFML). I graduated from Hanoi University of Science and Technology with a Bachelor’s degree in Computer Science. Before joining UT Austin, I was an AI Research Resident at VinAI Research under the supervision of Dr. Hung Bui.
Research: My research focuses on both fundamental and applied problems in probabilistic machine learning, deep learning, and statistics.
1. Computational Optimal Transport. My research makes Optimal Transport scalable for statistical inference (low time, space, and sample complexity) via the one-dimensional projection approach known as sliced optimal transport (the sliced Wasserstein distance). My work focuses on three key aspects of sliced Wasserstein: numerical approximation, the projection operator, and the slicing distribution.
2. Efficiency, Scalability, Interpretability, and Trustworthiness of AI. My research enhances the performance of 3D vision models, speeds up the training of generative models, adapts prediction models to new unseen domains, explains multimodal transferable representation, and ensures fairness and robustness in learning processes.
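To make the one-dimensional projection idea concrete, here is a minimal illustrative sketch (not the author's code) of a Monte Carlo estimate of the sliced Wasserstein distance: project both point clouds onto random directions on the unit sphere, compute the closed-form 1D Wasserstein distance on each projection via sorting, and average. The function name and parameters are hypothetical.

```python
import numpy as np

def sliced_wasserstein(X, Y, n_projections=100, p=2, seed=0):
    """Monte Carlo estimate of the sliced p-Wasserstein distance.

    X, Y: arrays of shape (n, d) with the same n, viewed as empirical
    measures with uniform weights. Projection directions are drawn
    uniformly from the unit sphere in R^d.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Sample random directions and normalize them to the unit sphere.
    theta = rng.normal(size=(n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # Project both point clouds onto every direction: shape (n, n_projections).
    X_proj = X @ theta.T
    Y_proj = Y @ theta.T
    # For equal-size uniform empirical measures, the 1D Wasserstein distance
    # has a closed form: match sorted samples.
    X_sorted = np.sort(X_proj, axis=0)
    Y_sorted = np.sort(Y_proj, axis=0)
    return float(np.mean(np.abs(X_sorted - Y_sorted) ** p) ** (1.0 / p))
```

Each projection costs only a sort, O(n log n), which is what gives sliced Wasserstein its favorable time and sample complexity compared with solving a full d-dimensional optimal transport problem.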
News
Sep 26, 2024: 1 paper, Hierarchical Hybrid Sliced Wasserstein: A Scalable Metric for Heterogeneous Joint Distributions, is accepted at NeurIPS 2024.
May 1, 2024: 1 paper, Sliced Wasserstein with Random-Path Projecting Directions, is accepted at ICML 2024.
Feb 27, 2024: 1 paper, Integrating Efficient Optimal Transport and Functional Maps For Unsupervised Shape Correspondence Learning, is accepted at CVPR 2024.
Jan 19, 2024: 2 papers, Towards Convergence Rates for Parameter Estimation in Gaussian-gated Mixture of Experts and On Parameter Estimation in Deviated Gaussian Mixture of Experts, are accepted at AISTATS 2024.
Jan 16, 2024: 4 papers, Quasi-Monte Carlo for 3D Sliced Wasserstein (Spotlight Presentation), Sliced Wasserstein Estimation with Control Variates, Diffeomorphic Deformation via Sliced Wasserstein Distance Optimization for Cortical Surface Reconstruction, and Revisiting Deep Audio-Text Retrieval Through the Lens of Transportation, are accepted at ICLR 2024.
Sep 21, 2023: 4 papers, Energy-Based Sliced Wasserstein Distance, Markovian Sliced Wasserstein Distances: Beyond Independent Projections, Designing Robust Transformers Using Robust Kernel Density Estimation, and Minimax Optimal Rate for Parameter Estimation in Multivariate Deviated Models, are accepted at NeurIPS 2023.
Apr 24, 2023: 1 paper, Self-Attention Amortized Distributional Projection Optimization for Sliced Wasserstein Point-Cloud Reconstruction, is accepted at ICML 2023.
Jan 20, 2023: 1 paper, Hierarchical Sliced Wasserstein Distance, is accepted at ICLR 2023.
Sep 14, 2022: 4 papers, Revisiting Sliced Wasserstein on Images: From Vectorization to Convolution, Amortized Projection Optimization for Sliced Wasserstein Generative Models, Improving Transformer with an Admixture of Attention Heads, and FourierFormer: Transformer Meets Generalized Fourier Integral Theorem, are accepted at NeurIPS 2022.
Apr 24, 2022: 2 papers, Improving Mini-batch Optimal Transport via Partial Transportation and On Transportation of Mini-batches: A Hierarchical Approach, are accepted at ICML 2022.
Jan 24, 2021: 2 papers, Distributional Sliced-Wasserstein and Applications to Generative Modeling (Spotlight Presentation) and Improving Relational Regularized Autoencoders with Spherical Sliced Fused Gromov Wasserstein, are accepted at ICLR 2021.