
Article Information

  • Title: Near-Optimal NP-Hardness of Approximating MAX k-CSP_R
  • Authors: Pasin Manurangsi; Preetum Nakkiran; Luca Trevisan
  • Journal: Theory of Computing
  • Print ISSN: 1557-2862
  • Electronic ISSN: 1557-2862
  • Year: 2022
  • Volume: 18
  • Pages: 1-29
  • DOI: 10.4086/toc.2022.v018a003
  • Language: English
  • Publisher: University of Chicago
  • Abstract: We prove almost optimal hardness for MAX k-CSP_R. In MAX k-CSP_R, we are given a set of constraints, each of which depends on at most k variables. Each variable can take any value from 1, 2, ..., R. The goal is to find an assignment to the variables that maximizes the number of satisfied constraints. We show that, for any k ≥ 2 and R ≥ 16, it is NP-hard to approximate MAX k-CSP_R to within a factor of k^{O(k)} (log R)^{k/2} / R^{k−1}. In the regime where 3 ≤ k = o(log R / log log R), this ratio improves upon Chan's O(k/R^{k−2})-factor NP-hardness of approximation for MAX k-CSP_R (J. ACM 2016). Moreover, when k = 2, our result matches the best known hardness result of Khot, Kindler, Mossel and O'Donnell (SIAM J. Comp. 2007). We remark here that NP-hardness of an approximation factor of 2^{O(k)} log(kR) / R^{k−1} is implicit in the (independent) work of Khot and Saket (ICALP 2015), which is better than our ratio for all k ≥ 3. In addition to the above hardness result, by extending an algorithm for MAX 2-CSP_R by Kindler, Kolla and Trevisan (SODA 2016), we provide an Ω(log R / R^{k−1})-approximation algorithm for MAX k-CSP_R. Thanks to Khot and Saket's result, this algorithm is tight up to a factor of O(k^2) when k ≤ R^{O(1)}. In comparison, when 3 ≤ k is a constant, the previously best known algorithm achieves an O(k/R^{k−1})-approximation for the problem, which is a factor of O(k log R) from the inapproximability ratio, in contrast to our gap of O(k^2).
  • Keywords: constraint satisfaction; hardness of approximation; approximation algorithm
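To make the problem definition in the abstract concrete, here is a minimal sketch of a MAX k-CSP_R instance and its objective. The encoding below (constraints as variable tuples paired with sets of satisfying value tuples, values drawn from {0, ..., R−1}) is a hypothetical illustration, not the paper's formalism:

```python
import random

# Hypothetical toy encoding (not from the paper): a MAX k-CSP_R instance
# is a list of constraints; each constraint is a pair (vars_, satisfying),
# where vars_ is a tuple of at most k variable indices and satisfying is
# a set of value tuples over {0, ..., R-1} that satisfy the constraint.

def satisfied_count(constraints, assignment):
    """Number of constraints satisfied by `assignment` (a list of values)."""
    return sum(
        tuple(assignment[v] for v in vars_) in satisfying
        for vars_, satisfying in constraints
    )

def random_assignment(n, R, rng=random):
    """Uniformly random assignment: each of n variables gets a value in {0, ..., R-1}."""
    return [rng.randrange(R) for _ in range(n)]

# Example with k = 2, R = 3: two equality constraints on three variables.
R = 3
constraints = [
    ((0, 1), {(a, a) for a in range(R)}),  # x0 == x1
    ((1, 2), {(a, a) for a in range(R)}),  # x1 == x2
]
print(satisfied_count(constraints, [1, 1, 1]))  # both constraints satisfied -> 2
print(satisfied_count(constraints, [0, 1, 2]))  # neither satisfied -> 0
```

A uniformly random assignment satisfies each constraint with probability |satisfying| / R^k, which is the trivial baseline that the Ω(log R / R^{k−1})-approximation algorithm in the paper improves upon.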