We obtain minimax lower and upper bounds for the expected distortion redundancy of empirically designed vector quantizers. We show that the mean squared distortion of a vector quantizer designed from $n$ i.i.d. data points using any design algorithm is at least $\Omega (n^{-1/2})$ away from the optimal distortion for some distribution on a bounded subset of ${\cal R}^d$. Together with existing upper bounds, this result shows that the minimax distortion redundancy for empirical quantizer design, as a function of the size of the training data, is asymptotically on the order of $n^{-1/2}$. We also derive a new upper bound for the performance of the empirically optimal quantizer.
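To make the setting concrete, the following is a minimal sketch (not part of the paper) of empirical quantizer design: a $k$-point codebook is fitted to $n$ i.i.d. training points by approximately minimizing the empirical mean squared distortion via Lloyd's algorithm, and the distortion of the resulting quantizer can then be evaluated on fresh samples. The function names and parameters are illustrative assumptions, not notation from the text.

```python
import numpy as np

def design_quantizer(data, k, iters=50, seed=0):
    """Empirically design a k-point vector quantizer (codebook) from
    training data by approximately minimizing the empirical mean
    squared distortion with Lloyd's algorithm (an illustrative choice
    of design algorithm, not the paper's)."""
    rng = np.random.default_rng(seed)
    # Initialize the codebook with k distinct training points.
    codebook = data[rng.choice(len(data), size=k, replace=False)]
    for _ in range(iters):
        # Nearest-neighbor assignment: squared distances to each code vector.
        d2 = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # Centroid update; keep the old code vector for empty cells.
        for j in range(k):
            if np.any(labels == j):
                codebook[j] = data[labels == j].mean(axis=0)
    return codebook

def distortion(data, codebook):
    """Mean squared distortion of the nearest-neighbor quantizer
    defined by `codebook`."""
    d2 = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d2.min(axis=1).mean()
```

The distortion redundancy studied in the abstract is the gap between the (population) distortion of the empirically designed quantizer and the optimal distortion achievable for the source distribution; the bounds concern how this gap shrinks with the training-set size $n$.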