
Article Information

  • Title: GPU-accelerated Exhaustive Verification of the Collatz Conjecture
  • Authors: Takumi Honda; Yasuaki Ito; Koji Nakano
  • Journal: International Journal of Networking and Computing
  • Print ISSN: 2185-2847
  • Year: 2017
  • Volume: 7
  • Issue: 1
  • Pages: 69-85
  • Language: English
  • Publisher: International Journal of Networking and Computing
  • Abstract: The main contribution of this paper is to present an implementation that performs an exhaustive search to verify the Collatz conjecture using a GPU. Consider the following operation on an arbitrary positive number: if the number is even, divide it by two, and if the number is odd, triple it and add one. The Collatz conjecture asserts that, starting from any positive number m, repeated iteration of this operation eventually produces the value 1. We have implemented the exhaustive verification on an NVIDIA GeForce GTX TITAN X and evaluated its performance. The experimental results show that our GPU implementation can verify 1.31x10^12 64-bit numbers per second, while a sequential CPU implementation on an Intel Core i7-4790 can verify 5.25x10^9 64-bit numbers per second. Thus, our GPU implementation attains a speed-up factor of 249 over the sequential CPU implementation. Additionally, we used the GPU to accelerate the computation of the delay, that is, the number of the above operations required until a number reaches 1, which is one of the quantities of mathematical interest for the Collatz conjecture. Using a similar idea, we achieved a speed-up factor of 73.
  • Keywords: Collatz conjecture; GPGPU; Parallel processing; Exhaustive verification; Coalesced access; Bank conflict
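
The abstract above describes both the basic Collatz operation and the notion of delay (the number of operations applied until a value reaches 1). For illustration only, the following is a minimal CUDA sketch of a one-thread-per-number delay computation; it is not the authors' implementation, and the techniques the paper reports (coalesced access, bank-conflict avoidance, handling of values exceeding 64 bits) are omitted. Kernel and variable names are hypothetical.

    // collatz_delay.cu -- minimal sketch of computing Collatz delays on a GPU.
    // Hypothetical example; not the implementation evaluated in the paper.
    #include <cstdio>
    #include <cstdint>

    // One thread per starting value: apply n/2 (even) or 3n+1 (odd) until
    // n == 1 and record the number of operations (the "delay").
    // NOTE: 3n+1 may exceed 64 bits for large inputs; overflow handling and
    // the paper's memory-access optimizations are intentionally left out.
    __global__ void collatzDelay(uint64_t start, uint32_t *delay, int count)
    {
        int idx = blockIdx.x * blockDim.x + threadIdx.x;
        if (idx >= count) return;

        uint64_t n = start + (uint64_t)idx;
        uint32_t d = 0;
        while (n != 1) {
            if (n & 1ULL) n = 3 * n + 1;   // odd: triple and add one
            else          n >>= 1;         // even: halve
            ++d;
        }
        delay[idx] = d;
    }

    int main()
    {
        const int count = 1 << 20;     // check 2^20 consecutive numbers
        const uint64_t start = 1;      // hypothetical starting point
        uint32_t *d_delay;
        cudaMalloc(&d_delay, count * sizeof(uint32_t));

        collatzDelay<<<(count + 255) / 256, 256>>>(start, d_delay, count);
        cudaDeviceSynchronize();

        uint32_t first;
        cudaMemcpy(&first, d_delay, sizeof(uint32_t), cudaMemcpyDeviceToHost);
        printf("delay(%llu) = %u\n", (unsigned long long)start, first);

        cudaFree(d_delay);
        return 0;
    }

A full verification run would launch such a kernel repeatedly over successive blocks of the 64-bit range; the reported throughput figures come from the authors' far more heavily optimized implementation.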