While the Lagrange dual of a linear program is standard, the same cannot be said of convex quadratic programming: the dual problem is not uniquely determined, an ambiguity caused by the possible rank deficiency of the Hessian of the objective function. In this talk, we shall introduce a restricted Wolfe dual for convex composite quadratic programming to address this issue. The restricted Wolfe dual possesses many nice theoretical properties resembling those of linear conic programming, and it allows one to design efficient dual-based augmented Lagrangian methods with guaranteed convergence. Interestingly, our study of the restricted Wolfe dual has also led to the proof of a symmetric Gauss-Seidel decomposition theorem, which in turn plays a critical role in the successful design of efficient accelerated proximal gradient methods for least squares optimization problems and of alternating direction methods of multipliers for multi-block convex optimization problems.
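As a minimal sketch of where the ambiguity comes from (the generic Wolfe dual, not the restricted formulation of the talk), consider a convex quadratic program

\[
\min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\, x^{\top} Q x + c^{\top} x
\quad \text{subject to} \quad A x \ge b, \qquad Q \succeq 0 .
\]

Its Wolfe dual retains the primal variable $x$ alongside the multiplier $\lambda$:

\[
\max_{x,\; \lambda \ge 0} \; \tfrac{1}{2}\, x^{\top} Q x + c^{\top} x - \lambda^{\top} (A x - b)
\quad \text{subject to} \quad Q x + c - A^{\top} \lambda = 0 .
\]

Using the stationarity constraint, one checks that shifting $x$ along the null space of $Q$ preserves both dual feasibility and the dual objective value, so whenever $Q$ is rank deficient the dual solution set is unbounded in $x$; restricting $x$ to the range space of $Q$, in the spirit of the restricted Wolfe dual, removes this degeneracy.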