https://blog.riseml.com/benchmarking-go ... 1c03b71384
TL;DR About 3x faster for twice the price (compared to a V100 instance on Amazon).
-Carl
Benchmarking Google's TPUv2
Moderators: hgm, Rebel, chrisw
-
- Posts: 186
- Joined: Fri Oct 10, 2014 10:05 pm
- Location: Berkeley, CA
-
- Posts: 273
- Joined: Wed Aug 24, 2016 9:49 pm
Re: Benchmarking Google's TPUv2
I hope the price of the TPU will still drop. I suppose there aren't many of those fancy TPU chips yet, which is why the price is where it is right now. The price should drop once Google manufactures more of those chips, not just for their own datacenters but for Google Cloud.
-
- Posts: 1080
- Joined: Fri Sep 16, 2016 6:55 pm
- Location: USA/Minnesota
- Full name: Leo Anger
Re: Benchmarking Google's TPUv2
Imagine hypothetically that you now have a Tensor Processing Unit. Can we use it for chess? Can we program it? I don't understand the thing.
Advanced Micro Devices fan.
-
- Posts: 931
- Joined: Tue Mar 09, 2010 3:46 pm
- Location: New York
- Full name: Álvaro Begué (RuyDos)
Re: Benchmarking Google's TPUv2
Leo wrote:Imagine hypothetically that you now have a Tensor Processing Unit. Can we use it for chess? Can we program it? I don't understand the thing.
We can use it to evaluate convolutional neural networks. If you have a program like AlphaZero that makes heavy use of CNNs, it will run much, much faster on a TPU than on a CPU.
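To give a flavor of what a TPU actually accelerates: the bulk of the work in a network like AlphaZero's is convolutions over board planes. Below is a minimal, hedged sketch in plain NumPy of a single-channel 2D convolution over an 8x8 "board" plane. The sizes and names here are illustrative assumptions, not AlphaZero's real architecture; a TPU does the same kind of multiply-accumulate work, just massively in parallel in hardware.

```python
import numpy as np

def conv2d(board, kernel):
    """Naive valid-mode 2D convolution (no padding, stride 1).

    This is the core multiply-accumulate pattern that TPU matrix
    units accelerate; real frameworks lower it to hardware ops.
    """
    kh, kw = kernel.shape
    oh = board.shape[0] - kh + 1
    ow = board.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Sum of elementwise products over the sliding window
            out[i, j] = np.sum(board[i:i + kh, j:j + kw] * kernel)
    return out

# Hypothetical example: one 8x8 input plane with a single "piece" set,
# convolved with a 3x3 filter of ones.
board = np.zeros((8, 8))
board[3, 3] = 1.0
kernel = np.ones((3, 3))
features = conv2d(board, kernel)
print(features.shape)  # (6, 6)
```

On a CPU this loop runs serially; the point of a TPU is that its matrix unit performs thousands of such multiply-accumulates per cycle, which is why CNN-heavy engines see such large speedups.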
-
- Posts: 31
- Joined: Fri Nov 25, 2016 10:14 am
- Location: Singapore
Re: Benchmarking Google's TPUv2
Eric B. Olsen shared his "Proposal for a High Precision Tensor Processing Unit" on arXiv.org at https://arxiv.org/abs/1706.03251
This is interesting for those who want to try to build their own TPU
on a FPGA for example.
Brahim HAMADICHAREF
Singapore