Sparse solutions to classes of optimization problems have been a major concern in many disciplines such as image processing and portfolio selection. It has long been observed that numerous classes of quadratic optimization problems, such as the standard quadratic programming problem (StQP), admit sparse optimal solutions for instances arising from many real-world applications, even though these problems are theoretically intractable (see the sample publications below). In this project, we develop a new theoretical framework to establish the existence of sparse optimal or approximate solutions to classes of intractable and non-convex QPs, and derive a precise probabilistic characterization of the sparsity of the sparsest optimal or approximate solutions to the underlying QPs. This theoretical insight will be used to design new provably good, efficient algorithms, and the new techniques will be applied to portfolio selection and computer vision.
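To make the sparsity phenomenon concrete: the StQP minimizes x'Qx over the standard simplex (x >= 0, sum x = 1). The sketch below is purely illustrative and is not the project's algorithm; it simply enumerates candidate solutions with support size at most two (for a pair {i, j}, the restricted problem is a one-dimensional quadratic in t with x_i = t, x_j = 1 - t, solvable in closed form), which is the kind of sparse solution whose existence the cited papers analyze for random matrices.

```python
import numpy as np

def stqp_support2_min(Q):
    """Best objective of min x'Qx over the simplex, restricted to
    solutions supported on at most two coordinates (illustrative only)."""
    n = Q.shape[0]
    best_val, best_x = np.inf, None
    # Support of size 1: x = e_i gives objective Q[i, i].
    for i in range(n):
        if Q[i, i] < best_val:
            x = np.zeros(n)
            x[i] = 1.0
            best_val, best_x = Q[i, i], x
    # Support of size 2: minimize a t^2 + 2 c t (1 - t) + b (1 - t)^2, t in (0, 1).
    for i in range(n):
        for j in range(i + 1, n):
            a, b, c = Q[i, i], Q[j, j], Q[i, j]
            denom = a + b - 2.0 * c
            if denom > 0:  # strictly convex on the edge; interior stationary point
                t = (b - c) / denom
                if 0.0 < t < 1.0:
                    val = a * t * t + 2.0 * c * t * (1.0 - t) + b * (1.0 - t) ** 2
                    if val < best_val:
                        x = np.zeros(n)
                        x[i], x[j] = t, 1.0 - t
                        best_val, best_x = val, x
    return best_val, best_x

# Small demo on a random symmetric matrix (entries are arbitrary test data).
rng = np.random.default_rng(0)
A = rng.uniform(size=(5, 5))
Q = (A + A.T) / 2.0
val, x = stqp_support2_min(Q)
print("objective:", val, "support size:", int(np.count_nonzero(x)))
```

For random Q this support-2 search often attains the global StQP minimum, which is the behavior the Chen–Peng–Zhang results make precise; in general, however, the global minimizer of an indefinite StQP may need a larger support.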
Sample Publications and References
•  X. Chen, J. Peng and S. Zhang. Existence of sparse solutions to standard quadratic programming problems with random matrices. Mathematical Programming, 141(1):273–293, 2013.
•  L. Mukherjee, V. Singh, J. Peng and C. Hinrichs. Learning Kernels for variants of Normalized Cuts: Convex Relaxations and Applications. Proceedings of the Conference on Computer Vision and Pattern Recognition (CVPR), June 2010. (Acceptance ratio 27%.)
•  L. Mukherjee, V. Singh and J. Peng. Scale invariant cosegmentation for image groups. Proceedings of the Conference on Computer Vision and Pattern Recognition (CVPR), June 2011. (Oral presentation; acceptance ratio 3.5%.)
•  X. Chen and J. Peng. New analysis on sparse solutions to StQP with random data. Forthcoming in Mathematics of Operations Research, 2015.
- Principal Investigator: Jiming Peng.
- PhD student: Jingnan Chen.
- Other Collaborators: Xin Chen, Shuzhong Zhang.
- Sponsor: NSF-CMMI, 2011–2015.