The estimation of large sparse inverse covariance matrices is a ubiquitous statistical problem in many application areas such as mathematical finance, geology, and health. Numerical approaches typically rely on maximum likelihood estimation, i.e., on minimizing the negative log-likelihood function. When the underlying Gaussian Markov random field is expected to be sparse, regularization techniques that add a sparsity prior, such as $l_1$-regularization, have become a popular way to enforce this structure. This leads to a convex but nondifferentiable target function. Recently, a quadratic approximate inverse covariance (QUIC) method was proposed. The hallmark of this method is its superlinear to quadratic convergence, which makes it among the most competitive methods. In this paper we present a sparse version of this method and show that, using advanced sparse matrix technology, the sparse version of QUIC can easily deal with problems of size one million within a few minutes on modern multicore computers. We demonstrate the effectiveness and scalability of our method on several large-scale synthetic and real-world data sets, including financial linear regression models.
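As a toy-scale illustration of the $l_1$-regularized estimation problem the abstract describes, the sketch below uses scikit-learn's off-the-shelf graphical lasso solver, which minimizes the same convex nondifferentiable objective, $-\log\det\Theta + \mathrm{tr}(S\Theta) + \alpha\|\Theta\|_1$. This is not the QUIC method of the paper; the data, dimensions, and regularization parameter are illustrative assumptions.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

# Toy problem (illustrative, not from the paper): recover a sparse
# precision (inverse covariance) matrix from samples.
rng = np.random.default_rng(0)
p = 10
# Ground-truth sparse precision matrix: tridiagonal, diagonally dominant.
theta_true = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
cov_true = np.linalg.inv(theta_true)
X = rng.multivariate_normal(np.zeros(p), cov_true, size=2000)

# GraphicalLasso minimizes the l_1-penalized negative log-likelihood;
# the penalty drives many off-diagonal entries of the estimate to zero.
model = GraphicalLasso(alpha=0.05).fit(X)
theta_hat = model.precision_
off_diag_zeros = np.sum((np.abs(theta_hat) < 1e-8) & ~np.eye(p, dtype=bool))
print(theta_hat.shape, off_diag_zeros)
```

At the million-variable scale targeted by the paper, dense solvers like this are infeasible, which is precisely the gap the sparse QUIC variant addresses.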