Abstract: We prove a non-asymptotic concentration inequality for the spectral norm of sparse inhomogeneous random tensors with Bernoulli entries. For an order-$k$ inhomogeneous random tensor $T$ with sparsity $p_{\max}\ge \frac{c\log n}{n}$, we show that $\|T-\mathbb{E}T\|=O(\sqrt{np_{\max}}\,\log^{k-2}(n))$ with high probability. The optimality of this bound up to polylogarithmic factors is provided by an information-theoretic lower bound. By tensor unfolding, we extend the range of sparsity to $p_{\max}\ge \frac{c\log n}{n^{m}}$ with $1\le m\le k-1$ and obtain concentration inequalities for different sparsity regimes. We also provide a simple way to regularize $T$ such that $O(\sqrt{n^{m}p_{\max}})$ concentration still holds down to sparsity $p_{\max}\ge \frac{c}{n^{m}}$ with $k/2\le m\le k-1$. We present our concentration and regularization results with two applications: (i) a randomized construction of hypergraphs of bounded degrees with good expander mixing properties, (ii) concentration of sparsified tensors under uniform sampling.
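As an informal illustration (not part of the paper's argument), the tensor-unfolding viewpoint in the abstract can be checked numerically: for a homogeneous order-$3$ Bernoulli tensor with entry probability $p_{\max}$, the mode-1 unfolding of $T-\mathbb{E}T$ is an $n\times n^{2}$ matrix whose spectral norm should be on the order of $\sqrt{n^{2}p_{\max}}$ (up to polylog factors). All names and the choice $k=3$, $m=2$ here are assumptions made for the demo.

```python
import numpy as np

# Illustrative sketch, assuming k = 3 and m = 2: empirically compare the
# spectral norm of the mode-1 unfolding of a centered sparse Bernoulli
# tensor against the predicted scale sqrt(n^m * p_max).
rng = np.random.default_rng(0)

n = 30                       # dimension per mode (small, for a quick demo)
p_max = np.log(n) / n**2     # sparsity at the threshold p_max >= c log(n) / n^m

# Homogeneous case p_ijk = p_max, so E[T] is the constant tensor p_max.
T = (rng.random((n, n, n)) < p_max).astype(float)
centered = T - p_max

# Mode-1 unfolding: flatten the order-3 tensor into an n x n^2 matrix.
M = centered.reshape(n, n * n)

spectral_norm = np.linalg.norm(M, 2)   # largest singular value
scale = np.sqrt(n**2 * p_max)          # predicted order sqrt(n^m * p_max)
print(spectral_norm, scale, spectral_norm / scale)
```

The ratio printed at the end stays bounded as $n$ grows, consistent with the $O(\sqrt{n^{m}p_{\max}})$ behavior the abstract states for the unfolded regime.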