Benchmarking Google's TPUv2

Posted: Fri Feb 23, 2018 10:48 pm
by clumma
https://blog.riseml.com/benchmarking-go ... 1c03b71384

TL;DR About 3x faster for twice the price (compared to a V100 instance on Amazon).

-Carl

Re: Benchmarking Google's TPUv2

Posted: Fri Feb 23, 2018 10:51 pm
by CheckersGuy
I hope the price of the TPU will still drop. I suppose there aren't many of those fancy TPU chips yet, which is why the price is where it is right now. The price should drop once Google manufactures more of them, not just for its own datacenters but for Google Cloud.

Re: Benchmarking Google's TPUv2

Posted: Mon Feb 26, 2018 9:05 pm
by Leo
Imagine hypothetically that you now have a Tensor Processing Unit. Can we use it for chess? Can we program it? I don't understand the thing.

Re: Benchmarking Google's TPUv2

Posted: Mon Feb 26, 2018 9:11 pm
by AlvaroBegue
Leo wrote:Imagine hypothetically that you now have a Tensor Processing Unit. Can we use it for chess? Can we program it? I don't understand the thing.
We can use it to evaluate convolutional neural networks. If you have a program like AlphaZero that makes heavy use of CNNs, it will run much much faster on a TPU than on a CPU.
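To make the point concrete, here is a minimal, naive NumPy sketch of the core operation an AlphaZero-style network performs: convolving a stack of board planes with learned filters. The input encoding (12 piece planes for an 8x8 board) and the filter count are illustrative assumptions, not AlphaZero's actual configuration. A TPU's matrix units execute exactly these multiply-accumulates, but thousands at a time, which is where the speedup over a CPU comes from.

```python
import numpy as np

def conv2d_relu(planes, kernels):
    """Naive 'same'-padded 3x3 convolution followed by ReLU.
    planes:  (C_in, 8, 8)  input board planes
    kernels: (C_out, C_in, 3, 3) learned filters
    returns: (C_out, 8, 8) feature maps"""
    c_in, h, w = planes.shape
    c_out = kernels.shape[0]
    padded = np.pad(planes, ((0, 0), (1, 1), (1, 1)))  # zero-pad borders
    out = np.zeros((c_out, h, w))
    for o in range(c_out):
        for y in range(h):
            for x in range(w):
                # one multiply-accumulate over a 3x3 neighborhood of all planes
                out[o, y, x] = np.sum(padded[:, y:y + 3, x:x + 3] * kernels[o])
    return np.maximum(out, 0.0)  # ReLU activation

# Hypothetical input: 12 piece planes for one chess position, 32 filters
board = np.random.rand(12, 8, 8)
filters = np.random.rand(32, 12, 3, 3)
features = conv2d_relu(board, filters)
print(features.shape)  # (32, 8, 8)
```

A real network stacks many such layers, so the triple loop above turns into billions of multiply-accumulates per position; that is the workload a TPU (or GPU) accelerates.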

Re: Benchmarking Google's TPUv2

Posted: Tue Feb 27, 2018 12:45 am
by Leo
OK

Re: Benchmarking Google's TPUv2

Posted: Fri Mar 02, 2018 8:14 am
by bhamadicharef
Eric B. Olsen shared his "Proposal for a High Precision Tensor Processing Unit" on arXiv.org at https://arxiv.org/abs/1706.03251

This is interesting for those who want to try to build their own TPU,
on an FPGA for example.