Article Information

  • Title: SVM via Saddle Point Optimization: New Bounds and Distributed Algorithms
  • Authors: Lingxiao Huang; Yifei Jin; Jian Li
  • Journal: Leibniz International Proceedings in Informatics (LIPIcs)
  • Electronic ISSN: 1868-8969
  • Year: 2018
  • Volume: 101
  • Pages: 25:1-25:13
  • DOI: 10.4230/LIPIcs.SWAT.2018.25
  • Publisher: Schloss Dagstuhl – Leibniz-Zentrum für Informatik
  • Abstract: We study two important SVM variants: hard-margin SVM (for the linearly separable case) and nu-SVM (for the linearly non-separable case). We propose new algorithms from the perspective of saddle point optimization. Our algorithms achieve (1-epsilon)-approximations with running time O~(nd + n sqrt{d/epsilon}) for both variants, where n is the number of points and d is the dimensionality. To the best of our knowledge, the current best algorithm for nu-SVM is based on a quadratic programming approach, which requires Omega(n^2 d) time in the worst case [Joachims, 1998; Platt, 1999]. In this paper, we provide the first nearly linear time algorithm for nu-SVM. The current best algorithm for hard-margin SVM, the Gilbert algorithm [Gärtner and Jaggi, 2009], requires O(nd/epsilon) time; our algorithm improves the running time by a factor of sqrt{d}/sqrt{epsilon}. Moreover, our algorithms can be implemented naturally in distributed settings. We prove that they require O~(k(d + sqrt{d/epsilon})) communication cost, where k is the number of clients, which almost matches the theoretical lower bound. Numerical experiments support our theory and show that our algorithms converge faster than previous methods on high-dimensional, large, dense data sets. (A sketch of the saddle-point view of hard-margin SVM follows this list.)
  • Keywords: nu-SVM; hard-margin SVM; saddle point optimization; distributed algorithm
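
As a point of reference for the saddle point perspective mentioned in the abstract, here is a minimal sketch of the standard max-min view of hard-margin SVM (assuming the homogeneous case with no bias term); this is the textbook reformulation, not necessarily the exact objective used by the authors:

    \max_{\|w\|_2 \le 1} \; \min_{p \in \Delta_n} \; \sum_{i=1}^{n} p_i \, y_i \langle w, x_i \rangle

Here \Delta_n is the probability simplex over the n training points (x_i, y_i) with labels y_i \in \{-1, +1\}. The inner minimum picks out the worst-margin point, so the optimal value is exactly the margin of the best separating hyperplane; because the objective is bilinear and both feasible sets are compact and convex, max and min can be exchanged, turning margin maximization into a bilinear saddle point problem, which is the setting in which the running times quoted in the abstract are obtained.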