In this paper, we give surprisingly efficient algorithms for list-decoding and testing {\em random} linear codes. Our main result is that random sparse linear codes are locally testable and locally list-decodable in the {\em high-error} regime with only a {\em constant} number of queries. More precisely, we show that for all constants $c > 0$ and $\gamma > 0$, and for every linear code $\calC \subseteq \{0,1\}^N$ which is:
\begin{itemize}
\item {\em sparse}: $|\calC| \leq N^c$, and
\item {\em unbiased}: each nonzero codeword in $\calC$ has weight in $\left(\frac{1}{2} - N^{-\gamma}, \frac{1}{2} + N^{-\gamma}\right)$,
\end{itemize}
$\calC$ is locally testable and locally list-decodable from $\left(\frac{1}{2} - \epsilon\right)$-fraction worst-case errors using only $\poly(1/\epsilon)$ queries to a received word. We also give {\em sub-exponential time} algorithms for list-decoding arbitrary unbiased (but not necessarily sparse) linear codes in the high-error regime. In particular, this yields the first sub-exponential time algorithm even for the problem of (unique) decoding random linear codes of inverse-polynomial rate from a fixed positive fraction of errors.
Earlier, Kaufman and Sudan had shown that sparse, unbiased codes can be locally (unique) decoded and locally tested from a constant fraction of errors, where this constant fraction tends to 0 as the number of codewords grows. Our results significantly strengthen their results, while also having significantly simpler proofs.
At the heart of our algorithms is a natural ``self-correcting'' operation defined on codes and received words. This self-correcting operation transforms a code $\calC$ with a received word $w$ into a simpler code $\calC'$ and a related received word $w'$, such that $w$ is close to $\calC$ if and only if $w'$ is close to $\calC'$. Starting with a sparse, unbiased code $\calC$ and an arbitrary received word $w$, a constant number of applications of the self-correcting operation reduces us to the case of local list-decoding and testing for the {\em Hadamard code}, for which the well-known algorithms of Goldreich-Levin and Blum-Luby-Rubinfeld are available. This yields the constant-query local algorithms for the original code $\calC$.
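Schematically, writing $\delta(w, \calC)$ for the relative Hamming distance from $w$ to the nearest codeword (notation introduced here for illustration; the exact definition of the operation appears in the body of the paper), each application preserves closeness,
\[
(\calC, w) \;\longmapsto\; (\calC', w'),
\qquad
\delta(w, \calC) \text{ small} \iff \delta(w', \calC') \text{ small},
\]
and for a sparse, unbiased code a constant number of iterations terminates at the Hadamard code:
\[
(\calC, w) \longmapsto (\calC^{(1)}, w^{(1)}) \longmapsto \cdots \longmapsto (\mathrm{Had}, w^{(t)}),
\qquad t = O(1).
\]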
Our algorithm for decoding unbiased linear codes in sub-exponential time proceeds similarly. Applying the self-correcting operation to an unbiased code $\calC$ and an arbitrary received word $w$ a super-constant number of times, we get reduced to the problem of {\em learning noisy parities}, for which non-trivial sub-exponential time algorithms were recently given by Blum-Kalai-Wasserman and Feldman-Gopalan-Khot-Ponnuswami. Our result generalizes a result of Lyubashevsky, which gave sub-exponential time algorithms for decoding random linear codes of inverse-polynomial rate from {\em random errors}.
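For context on the final step (a standard fact not stated in this abstract), the Blum-Kalai-Wasserman algorithm learns noisy parities on $n$ variables in slightly sub-exponential time and sample complexity
\[
2^{O(n / \log n)},
\]
which is the source of the sub-exponential running times above.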