If GPUs are so much better at parallel processing tasks, such as cracking passwords, than CPUs, would they be better for Chess as well?
https://mytechencounters.wordpress.com/ ... phic-card/
GPUs better for chess than CPUs?
Moderators: hgm, Rebel, chrisw
-
- Posts: 27811
- Joined: Fri Mar 10, 2006 10:06 am
- Location: Amsterdam
- Full name: H G Muller
Re: GPUs better for chess than CPUs?
Because Chess algorithms are by nature serial rather than parallel.
-
- Posts: 2651
- Joined: Fri Mar 17, 2006 6:05 am
Re: GPUs better for chess than CPUs?
hgm wrote: Because Chess algorithms are by nature serial rather than parallel.

A search tree is, by its nature, parallel, and not serial.
-
- Posts: 150
- Joined: Sat Jan 30, 2010 10:54 am
- Location: Israel
Re: GPUs better for chess than CPUs?
parrish wrote: A search tree is, by its nature, parallel, and not serial.

Not really. To know which branches of the tree are not even worth considering, you must calculate other branches first.
-
- Posts: 27811
- Joined: Fri Mar 10, 2006 10:06 am
- Location: Amsterdam
- Full name: H G Muller
Re: GPUs better for chess than CPUs?
parrish wrote: A search tree is, by its nature, parallel, and not serial.

Not if you want to employ alpha-beta pruning. If you search all branches of a certain node in parallel, and one of them gives you a beta cutoff, you can no longer prune the other branches, because you have already searched them in parallel. Which was a complete waste of time. So naively parallelizing the tree search will waste about 99.999999% of the CPU power. You will need an awful lot of parallelism to make up for that...
As to GPUs: I am not really an expert on them, but I thought GPUs were not so much parallel processors, but rather (SIMD) array processors. Or am I completely wrong on that?
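hgm's point about cutoffs can be made concrete with a toy negamax alpha-beta search (an illustrative sketch, not from the thread; the tree and helper names are invented for the example). The cutoff in the last subtree only happens because the earlier subtrees were searched first, which is exactly the serial dependency being described:

```python
def alphabeta(node, depth, alpha, beta, evaluate, children):
    """Negamax alpha-beta over a toy tree: leaves are ints,
    inner nodes are tuples of children."""
    kids = children(node)
    if depth == 0 or not kids:
        return evaluate(node)
    for child in kids:
        score = -alphabeta(child, depth - 1, -beta, -alpha, evaluate, children)
        if score >= beta:
            return beta      # beta cutoff: the remaining siblings are pruned
        if score > alpha:
            alpha = score    # a better move tightens the window for later siblings
    return alpha

visited = []                 # record which leaves actually get evaluated

def evaluate(leaf):
    visited.append(leaf)
    return leaf

children = lambda n: n if isinstance(n, tuple) else ()

tree = ((3, 5), (6, 9), (1, 2))
best = alphabeta(tree, 2, -100, 100, evaluate, children)
print(best, visited)   # 6 [3, 5, 6, 9, 1] -- leaf 2 was pruned
```

Searching the three subtrees blindly in parallel would visit leaf 2 as well; the pruning decision needed the results of the first two subtrees.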
-
- Posts: 4052
- Joined: Thu May 15, 2008 9:57 pm
- Location: Berlin, Germany
- Full name: Sven Schüle
Re: GPUs better for chess than CPUs?
hgm wrote: Not if you want to employ alpha-beta pruning. [...]

GPUs can execute a high number of parallel threads, e.g. 512 in the case of the NVIDIA GeForce 8800 (16 streaming multiprocessors x 32 threads each). Maybe chess programming could make use of that somehow. I could imagine it being attractive to calculate all subtrees of an ALL node in parallel, for instance. By "all subtrees" I possibly mean "all but the first", since I think we need at least the result of the first subtree to decide what the most likely type of a node we have predicted as an ALL node actually is. Now if we consume, say, 40 threads at an ALL node, we might still have enough threads (about 470) available for other tasks, so we can do the same thing a couple of times in different sections of the tree.
The search along the PV could remain the task of only few threads, as it is now. But the hard task of digging through all the garbage (e.g. ALL nodes) where failing low is most likely could be assigned to a large group of slave threads who would finish their work "almost instantly".
Dreaming, of course ... A lot of open issues, I know ... But without dreaming there might be no progress.
Sven
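The split Sven describes, searching the first subtree serially to establish a bound and then farming the remaining siblings out in parallel, resembles the "Young Brothers Wait" scheme from parallel alpha-beta. A minimal sketch, assuming a placeholder `search` function (none of these names come from the thread, and a real engine would search subtrees, not lists of scores):

```python
from concurrent.futures import ThreadPoolExecutor

def search(subtree, alpha, beta):
    # Placeholder for a real alpha-beta subtree search; here a
    # "subtree" is just a list of leaf scores.
    return max(subtree)

def split_all_node(subtrees, alpha, beta):
    """Search the eldest child serially, then its siblings in parallel.

    The serial first search raises alpha before the parallel part
    starts, so the sibling searches still benefit from a tightened
    window -- the "all but the first" idea above."""
    alpha = max(alpha, search(subtrees[0], alpha, beta))
    if alpha >= beta:
        return alpha   # immediate cutoff: no parallel work was started
    with ThreadPoolExecutor() as pool:
        scores = pool.map(lambda t: search(t, alpha, beta), subtrees[1:])
    return max([alpha, *scores])

print(split_all_node([[1, 2], [3, 4], [5, 6]], -100, 100))   # 6
```

The trade-off is the one hgm raises: any cutoff that the first subtree would not have produced on its own can still arrive after the parallel siblings are already running.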
-
- Posts: 900
- Joined: Tue Apr 27, 2010 3:48 pm
Re: GPUs better for chess than CPUs?
You have to find a way to do alpha-beta search efficiently with lots of threads.
You have to shoehorn move generation / move making / evaluation into GPU code (it's not even close to a natural programming paradigm for those things... for one thing, branching and recursion aren't present).
A pipe dream basically.
-
- Posts: 348
- Joined: Sat Feb 27, 2010 12:21 am
Re: GPUs better for chess than CPUs?
rbarreira wrote: You have to find a way to do alpha-beta search efficiently with lots of threads. You have to shoehorn move generation / move making / evaluation into GPU code (it's not even close to a natural programming paradigm for those things... for one thing, branching and recursion aren't present). A pipe dream basically.

My prediction is that someone new to computer chess who doesn't yet know it is extremely hard will find a method anyway, and after that everyone else will copy and beat him.
-
- Posts: 12542
- Joined: Wed Mar 08, 2006 8:57 pm
- Location: Redmond, WA USA
Re: GPUs better for chess than CPUs?
There are some difficulties with GPUs and chess.
1. GPUs do not handle recursion
2. Transfer of data from the PC memory to the GPU memory is expensive.
There are some efforts to get the GPU approach to work. So far, they have not been very successful.
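The first limitation is workable in principle: a recursive tree walk can always be rewritten with an explicit stack, which is the standard workaround on hardware without a general call stack. A toy sketch of the transformation (plain Python, not GPU code; the tree shape is invented for the example):

```python
def max_leaf_iterative(tree):
    """Find the maximum leaf of a nested-tuple tree without recursion,
    using an explicit stack in place of the call stack."""
    best = None
    stack = [tree]
    while stack:
        node = stack.pop()
        if isinstance(node, tuple):
            stack.extend(node)          # "recurse" by pushing children
        elif best is None or node > best:
            best = node
    return best

print(max_leaf_iterative(((3, 5), (6, (9, 2)), 1)))   # 9
```

The second limitation, host-to-device transfer cost, is harder to code around; it is the reason most proposals keep the whole search resident on one side of the bus.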
-
- Posts: 900
- Joined: Tue Apr 27, 2010 3:48 pm
Re: GPUs better for chess than CPUs?
marcelk wrote: My prediction is that someone new to computer chess who doesn't yet know it is extremely hard will find a method anyway, and after that everyone else will copy and beat him.

Why would that happen with computer chess specifically? Or do you mean that you think GPUs are universally good for all problems?
It could happen, but the odds are against it because no one thinks that GPUs are good for general problem solving. GPUs are good at applying a small set of simple operations to a large number of independent data units. That doesn't describe a tree search in any way...
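That contrast can be shown in miniature (a toy sketch; `evaluate` and `search` are invented stand-ins). A SIMD-friendly workload is one function applied to many independent items, while an alpha-beta loop threads each iteration through the alpha the previous one produced:

```python
INF = 10**9

def evaluate(position):
    # Stand-in static evaluation: any cheap per-item function.
    return sum(position)

# GPU-friendly shape: the same operation over many independent items,
# computable in any order, or all at once.
positions = [(1, 2), (3, 4), (5, 6)]
scores = list(map(evaluate, positions))

def search(subtree, alpha, beta):
    return max(subtree)                  # placeholder subtree search

# Alpha-beta shape: each iteration reads the alpha the previous one
# produced, so the loop is a dependency chain, not an independent batch.
alpha, beta = -INF, INF
for subtree in [(3, 5), (6, 9), (1, 2)]:
    alpha = max(alpha, search(subtree, alpha, beta))

print(scores, alpha)   # [3, 7, 11] 9
```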