
Computer Science Colloquia

Wednesday, November 4, 2015
Beilun Wang
Advisor: Jane Qi
Attending Faculty: Gabe Robins (Chair), Worthy Martin and James Cohoon

Rice Hall, Rm. 242 at 11:00 AM

PhD Qualifying Exam Presentation
Fast and Scalable Joint Estimators for Learning Multiple Related Sparse Gaussian Graphical Models

ABSTRACT
In this paper, we infer multiple sparse Gaussian Graphical Models (sGGMs) jointly from data samples of many tasks (large $K$) in a high-dimensional (large $p$) setting. Most previous studies of the joint estimation of multiple sGGMs rely on penalized log-likelihood estimators that involve expensive and difficult non-smooth optimization. We propose a novel approach, FASJEM (fast and scalable joint estimator for multiple sGGMs), for structure estimation of multiple sGGMs at a large scale. As the first study to use the M-estimator framework for this problem, FASJEM has the following sound properties: (1) We solve FASJEM in an entry-wise manner, which is parallelizable. (2) We choose a proximal algorithm to optimize FASJEM. This improves the computational efficiency significantly, from $O(Kp^3)$ to $O(Kp^2)$, and eases the memory requirement from $O(Kp^2)$ to $O(K)$. (3) We theoretically prove that FASJEM achieves a consistent estimation with a $\max\{ O(\log(Kp)/n_{tot}), O(p\log(Kp)/n_{tot}) \}$ convergence rate. On one synthetic and three real-world datasets, FASJEM shows significant improvements over baselines in accuracy, computational cost, and memory requirements.
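As a rough illustration of why an entry-wise proximal update parallelizes well (the abstract's point (1)), consider the proximal operator of an $\ell_1$ penalty, i.e. element-wise soft-thresholding, a standard building block in proximal algorithms for sparse estimation. This is only a sketch of the general idea, not the FASJEM update itself; the function name and toy inputs are illustrative.

```python
import numpy as np

def soft_threshold(Z, lam):
    """Proximal operator of lam * ||.||_1, applied entry-wise:
    prox(z) = sign(z) * max(|z| - lam, 0).
    Each entry depends only on itself, so updating all K precision
    matrices (K x p x p entries) is embarrassingly parallel."""
    return np.sign(Z) * np.maximum(np.abs(Z) - lam, 0.0)

# Toy example: one proximal step on a stack of K = 2 small matrices.
Z = np.array([[[1.5, -0.2], [-0.2, 2.0]],
              [[0.05, 0.8], [0.8, 1.0]]])
print(soft_threshold(Z, 0.1))
```

Because the operator touches each entry independently, only the current entry needs to be in memory at any time, which is the kind of property that lets an entry-wise solver trade the $O(Kp^2)$ working set of matrix-level methods for a much smaller per-update footprint.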