We present a randomized algorithm for producing a quasi-optimal hierarchically semi-separable (HSS) approximation to an $N \times N$ matrix $A$ using only matrix-vector products with $A$ and $A^{\mathsf{T}}$. We prove that, using $O(k\log(N/k))$ matrix-vector products and $O(Nk^2\log(N/k))$ additional runtime, the algorithm returns an HSS matrix $B$ with rank-$k$ blocks whose expected Frobenius norm error $\mathbb{E}\big[\|A-B\|_F^2\big]$ is at most $O(\log(N/k))$ times worse than the best possible approximation error by an HSS rank-$k$ matrix.
k matrix. In fact, the algorithm we analyze in a simple modification of an empirically effective method proposed by [Levitt & Martinsson, SISC 2024]. As a stepping stone towards our main result, we prove two results that are of independent interest: a similar guarantee for a variant of the algorithm which accesses
A's entries directly, and explicit error bounds for near-optimal subspace approximation using projection-cost-preserving sketches. To the best of our knowledge, our analysis constitutes the first polynomial-time quasi-optimality result for HSS matrix approximation, both in the explicit access model and the matrix-vector product query model.
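To make the matrix-vector product query model concrete, the following minimal Python sketch shows the only kind of access such an algorithm is permitted: black-box products $A x$ and $A^{\mathsf{T}} x$, applied here to random Gaussian test matrices of width $O(k)$. This is an illustrative access-model sketch, not the algorithm analyzed in the paper; the function names, the sketch width, and the oversampling parameter are assumptions chosen for the example.

```python
# Illustrative sketch of the matvec query model: A is touched only through
# black-box products matvec(x) = A @ x and rmatvec(x) = A.T @ x.
# The names and the sketch width s = k + oversample are illustrative assumptions.
import numpy as np

def query_sketches(matvec, rmatvec, N, k, oversample=10, seed=None):
    """Collect random sketches Y = A @ Omega and Z = A.T @ Psi,
    the only information available in the matvec query model."""
    rng = np.random.default_rng(seed)
    s = k + oversample                       # sketch width, O(k)
    Omega = rng.standard_normal((N, s))      # random test matrix for A
    Psi = rng.standard_normal((N, s))        # random test matrix for A.T
    Y = np.column_stack([matvec(Omega[:, j]) for j in range(s)])   # A @ Omega
    Z = np.column_stack([rmatvec(Psi[:, j]) for j in range(s)])    # A.T @ Psi
    return Omega, Psi, Y, Z

# Usage example with a dense matrix standing in for the black box.
N, k = 256, 8
A = np.random.default_rng(0).standard_normal((N, N))
Omega, Psi, Y, Z = query_sketches(lambda x: A @ x, lambda x: A.T @ x, N, k)
```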