We prove the following type of discrete entropy monotonicity for sums of isotropic, log-concave, independent and identically distributed random vectors $X_1,\dots,X_{n+1}$ on $\mathbb{Z}^d$:
$$H(X_1+\cdots+X_{n+1}) \ge H(X_1+\cdots+X_n) + \frac{d}{2}\log\Big(\frac{n+1}{n}\Big) + o(1),$$
where the $o(1)$-term vanishes as $H(X_1)\to\infty$. Moreover, for the $o(1)$-term, we obtain a rate of convergence $O\big(H(X_1)\,e^{-\frac{1}{d}H(X_1)}\big)$, where the implied constants depend on $d$ and $n$. This generalizes to
$\mathbb{Z}^d$ the one-dimensional result of the second named author (2023). As in dimension one, our strategy is to establish that the discrete entropy $H(X_1+\cdots+X_n)$ is close to the differential (continuous) entropy $h(X_1+U_1+\cdots+X_n+U_n)$, where $U_1,\dots,U_n$ are independent and identically distributed uniform random vectors on $[0,1]^d$, and to apply the theorem of Artstein, Ball, Barthe and Naor (2004) on the monotonicity of differential entropy. In fact, we show this result under more general assumptions than log-concavity, which are preserved up to constants under convolution. In order to show that log-concave distributions satisfy our assumptions in dimension
$d\ge 2$, more involved tools from convex geometry are needed, because a suitable position is required. We show that, for a log-concave function on $\mathbb{R}^d$ in isotropic position, its integral, barycenter and covariance matrix are close to their discrete counterparts. Moreover, in the log-concave case, we weaken the isotropicity assumption to what we call almost isotropicity. One of our technical tools is a discrete analogue of the upper bound on the isotropic constant of a log-concave function, which extends to dimensions $d\ge 1$ a result of Bobkov, Marsiglietti and Melbourne (2022).
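The inequality above can be illustrated numerically in the simplest case $d=1$, $n=1$. The sketch below is an illustration only, not a verification of the theorem's exact hypotheses: it uses a two-sided geometric distribution on $\mathbb{Z}$ (log-concave, but not variance-normalized, so the isotropicity assumption is replaced here by the large-entropy regime), and compares the entropy gain $H(X_1+X_2)-H(X_1)$ with the predicted $\frac{1}{2}\log 2$ in nats.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in nats) of a discrete probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Two-sided geometric law on Z: P(X = k) proportional to r^{|k|}.
# This is log-concave on Z; r close to 1 gives large entropy H(X1).
r = 0.9
ks = np.arange(-300, 301)            # truncated support; remaining tail mass is negligible
p = r ** np.abs(ks).astype(float)
p /= p.sum()

H1 = entropy(p)                      # H(X1)
H2 = entropy(np.convolve(p, p))      # H(X1 + X2): law of the sum via discrete convolution

# Predicted gain for d = 1, n = 1 is (1/2) log 2 up to o(1).
print(H2 - H1, 0.5 * np.log(2))
```

Here the observed gain exceeds $\frac{1}{2}\log 2 \approx 0.3466$, consistent with the stated monotonicity; as $r \to 1$ (so $H(X_1) \to \infty$), the $o(1)$-term shrinks.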