CPU vs GPU point rewards
-
- Posts: 8
- Joined: Sat Mar 14, 2020 10:01 pm
CPU vs GPU point rewards
Hello,
I am new here, so this may seem like a silly question that has already been answered...
I tried searching the forum, but it says the keywords "points, cpu, gpu" are too common.
So anyway, why are the points for CPU tasks much lower than points for GPU tasks?
Today I installed the client and started working on some tasks (although a bit dissatisfied: there were urgent messages that you need new donors for COVID research, but I was not assigned any COVID task, and my GPU was idling half the day...).
And I noticed that for a task that ran ~6 h on my CPU I got 733 points, while for a task that ran ~3 h on my GPU I got 49k points.
You may argue that GPU computing is better and does more work in the same time, so OK, I would accept that, but...
The CPU task consumed ~40 W of power for 6 hours (240 Wh), while the GPU task consumed ~100 W for 3 hours (300 Wh). So the GPU task consumed ~25% more energy but yielded ~6,600% more points, which means that from a points-per-money perspective the GPU task was much superior.
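To spell out that arithmetic (a rough sketch in Python; the wattages are my own meter readings, not official figures):
[code]
# Points per watt-hour for the two tasks described above.
# All figures are my own observations, not official F@H numbers.

cpu_points, cpu_watts, cpu_hours = 733, 40, 6
gpu_points, gpu_watts, gpu_hours = 49_000, 100, 3

cpu_wh = cpu_watts * cpu_hours   # 240 Wh
gpu_wh = gpu_watts * gpu_hours   # 300 Wh

print(f"CPU: {cpu_points / cpu_wh:.1f} points/Wh")                  # ~3.1
print(f"GPU: {gpu_points / gpu_wh:.1f} points/Wh")                  # ~163.3
print(f"GPU used {gpu_wh / cpu_wh - 1:.0%} more energy")            # 25%
print(f"GPU earned {gpu_points / cpu_points - 1:.0%} more points")  # ~6585%
[/code]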
I mean, what is the point of the points? If I saw this only as a game, I would cancel the CPU task, because it is not profitable.
I saw in the FAQ on your site that the GPU tasks are benchmarked on a CPU as well. That does not make much sense, since the architecture is different: it may take the CPU much, much more time to compute, but the GPU computes it faster WITHOUT using that much more energy.
Or, to take it from the other perspective - if the GPU tasks can be done on a CPU in your benchmarks, why can't we run the CPU tasks on our GPUs?
My GPU's been idling half a day, saying the server does not have anything for it to do, while my CPU was consuming power...
What is the difference between those tasks? Are there some special operations that the GPU cannot handle? Wouldn't it make sense then to award CPU computing more points?
Thanks for the answer - I am very interested in the computer-sciency answer, as I try to understand the process.
Martin
-
- Posts: 2522
- Joined: Mon Feb 16, 2009 4:12 am
- Location: Greenwood MS USA
Re: CPU vs GPU point rewards
Yes, if you intend to spend any money on this, buy large graphics cards. Some can produce 2.5 million Points Per Day (mine are old and produce 600 thousand PPD).
CPU folding is comparatively weak, but some parts of the problem need the more versatile Instruction Set of a CPU, so they still find work.
Back when I could still understand the discussion, GPUs had to treat the water surrounding the protein as a continuous medium, but CPUs could calculate the individual water molecules surrounding it. So it was more accurate but vastly slower.
A very expensive CPU has 256 threads; top-end GPUs have about 5,000 (somewhat slower) threads, so the brute-force calculations go to GPUs while the boutique equations go to CPUs.
Last edited by JimboPalmer on Sat Mar 28, 2020 2:51 pm, edited 2 times in total.
Tsar of all the Rushers
I tried to remain childlike, all I achieved was childish.
A friend to those who want no friends
Re: CPU vs GPU point rewards
A typical CPU can process maybe 8 threads concurrently, but they're supported by only 4 FPUs. A typical GPU can process thousands (even tens of thousands) of floating-point operations concurrently. FAH is really good at using all those resources in parallel.
Not everybody has a powerful GPU but everybody has a CPU. (even people with GPUs can use their CPUs concurrently) Small proteins can be processed at acceptable speeds on a CPU while large proteins are directed to GPUs.
In the early days of GPUs, there were some significant limitations but new hardware generations have progressed rapidly and software optimizations have helped significantly. For the most part, CPU hardware has progressed more slowly but it's still useful.
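As a rough illustration of that concurrency gap, here is a sketch with assumed, typical figures (real chips vary widely):
[code]
# Floating-point operations in flight per clock cycle.
# Assumed, typical figures; real hardware varies widely.

CPU_FPUS = 4        # FPUs in the typical CPU above
AVX_LANES = 8       # 8 single-precision lanes per 256-bit AVX unit
FMA_OPS = 2         # a fused multiply-add counts as 2 ops

GPU_SHADERS = 2560  # shader cores on a hypothetical midrange GPU

cpu_ops = CPU_FPUS * AVX_LANES * FMA_OPS  # 64 ops per cycle
gpu_ops = GPU_SHADERS * FMA_OPS           # 5120 ops per cycle

print(f"CPU: {cpu_ops} FP ops/cycle")
print(f"GPU: {gpu_ops} FP ops/cycle (~{gpu_ops // cpu_ops}x)")  # ~80x
[/code]
GPU clocks run lower than CPU clocks, so the real-world gap is smaller than the raw op count suggests, but it remains decisive.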
Posting FAH's log:
How to provide enough info to get helpful support.
-
- Posts: 8
- Joined: Sat Mar 14, 2020 10:01 pm
Re: CPU vs GPU point rewards
JimboPalmer wrote: So it was more accurate but vastly slower.
...
top-end GPUs have about 5,000 (somewhat slower) threads, so the brute-force calculations go to GPUs while the boutique equations go to CPUs.

So then the tasks that can be processed by the GPU should be processed by the GPU, and the tasks that can be processed only by the CPU should yield more points.
By the way, I really don't care about the points - I joined to help fight the coronavirus, not to gather points - but if there is a point system, it should be fair.
-
- Posts: 1164
- Joined: Wed Apr 01, 2009 9:22 pm
- Hardware configuration: Asus Z8NA D6C, 2 x5670@3.2 Ghz, , 12gb Ram, GTX 980ti, AX650 PSU, win 10 (daily use)
Asus Z87 WS, Xeon E3-1230L v3, 8gb ram, KFA GTX 1080, EVGA 750ti , AX760 PSU, Mint 18.2 OS
Not currently folding
Asus Z9PE- D8 WS, 2 E5-2665@2.3 Ghz, 16Gb 1.35v Ram, Ubuntu (Fold only)
Asus Z9PA, 2 Ivy 12 core, 16gb Ram, H folding appliance (fold only) - Location: Jersey, Channel islands
Re: CPU vs GPU point rewards
The points system has long been debated. The easiest explanation is that the most points go to the results that are returned the quickest and also do the most science. The CPU cores haven't been updated in years; the GPU cores take advantage of new code that helps the speed of the science, hence more points for GPU. Also, CPU-wise, most home PCs have only had 4/6-core CPUs; it's only recently that AMD has been pushing mainstream 8+ core CPUs.
Re: CPU vs GPU point rewards
Nathan_P wrote: The CPU cores haven't been updated in years...

Not really true. GROMACS (used in the CPU FAHcore) was updated to use AVX rather than being limited to SSE2. But OpenMM (used for GPUs) has been updated more recently.
Posting FAH's log:
How to provide enough info to get helpful support.
-
- Posts: 2522
- Joined: Mon Feb 16, 2009 4:12 am
- Location: Greenwood MS USA
Re: CPU vs GPU point rewards
MartinPanda wrote: it should be fair

It is NOT fair; it values quick results over slow results. That is why it is named the Quick Return Bonus.
The researchers cannot assign the next set of work units until they get back all of the previous set, so they reward speed and discourage abandoning WUs.
Getting results may not have been how you would have run this project, but it sure seems to be how F@H is organized now. They want results.
Fairness is not the overriding goal of academic science; results are. "Publish or perish":
https://en.wikipedia.org/wiki/Publish_or_perish
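For the curious, the points FAQ describes the bonus as a square-root multiplier on the base points. A minimal sketch (the k factor is set per project; the numbers below are made up for illustration):
[code]
import math

def final_points(base_points, k, timeout_days, elapsed_days):
    """Quick Return Bonus as described in the points FAQ (sketch).

    k is a per-project constant; the bonus also requires a passkey
    and a high fraction of successfully returned WUs.
    """
    bonus = max(1.0, math.sqrt(k * timeout_days / elapsed_days))
    return base_points * bonus

# Made-up numbers: the same WU returned in 1 day vs 5 days
print(round(final_points(1000, k=26, timeout_days=10, elapsed_days=1)))  # 16125
print(round(final_points(1000, k=26, timeout_days=10, elapsed_days=5)))  # 7211
[/code]
Return the same work five times slower and you get less than half the points: that is the speed incentive in one formula.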
Tsar of all the Rushers
I tried to remain childlike, all I achieved was childish.
A friend to those who want no friends
-
- Posts: 8
- Joined: Sat Mar 14, 2020 10:01 pm
Re: CPU vs GPU point rewards
JimboPalmer wrote: It is NOT fair; it values quick results over slow results. That is why it is named the Quick Return Bonus.
The researchers cannot assign the next set of work units until they get back all of the previous set, so they reward speed and discourage abandoning WUs.
Getting results may not have been how you would have run this project, but it sure seems to be how F@H is organized now. They want results.

I don't dispute the bonus for faster delivery of the result; I completely understand that.
What I dispute is that the CPU projects have much lower base points than GPU projects, yet they need similar time/energy to complete.
For example, I have now been assigned 2 projects: a CPU project with base 243 points and a GPU project with base 9405 points.
So basically, it would seem that the CPU projects are less important... If they want faster results, they may want to run these projects on GPUs instead. And if the calculations are so special that they cannot be done on the faster GPUs, then they should compensate with more base points.
I mean, for now it seems that I am wasting my CPU (and energy), since the CPU projects are apparently not that important...
-
- Posts: 1996
- Joined: Sun Mar 22, 2020 5:52 pm
- Hardware configuration: 1: 2x Xeon E5-2697v3@2.60GHz, 512GB DDR4 LRDIMM, SSD Raid, Win10 Ent 20H2, Quadro K420 1GB, FAH 7.6.21
2: Xeon E3-1505Mv5@2.80GHz, 32GB DDR4, NVME, Win10 Pro 20H2, Quadro M1000M 2GB, FAH 7.6.21 (actually have two of these)
3: i7-960@3.20GHz, 12GB DDR3, SSD, Win10 Pro 20H2, GTX 750Ti 2GB, GTX 1080Ti 11GB, FAH 7.6.21 - Location: UK
Re: CPU vs GPU point rewards
Only you can decide if you are wasting your time and money, but CPU projects are at this time important - they are good for certain types of work. True, CPUs are slower from the perspective of the science, and hence on a science-based points system CPUs will always score lower points (and FAH is really about the science, is it not?).
The way I look at it is … I drive both a sports car and an SUV … The sports car gets me places quickly and for the most part I am happy with that - the SUV however is slow/noisy and guzzles fuel (gas), but I still drive it because I choose to, and for pulling people out of ditches it is a damn sight better than my sports car, even though that is quicker and very much more fuel efficient … and there are times when we need to shift lots of stuff/people that both cars are used at the same time … I would love for fuel (or is it gas?) stations to charge me a fraction of the cost to fill up the SUV than they do for the sports car - but they don't - and this might be seen as unfair?
I can beat myself up about the cost effectiveness of running two vehicles and the fact the SUV is far less value for money than the sports car … but I do choose to drive it and it is worth it to me.
edit: actually at the moment we are in lockdown and neither car is any use - feels a bit similar to waiting for WUs to be assigned (only joking) !!
2x Xeon E5-2697v3, 512GB DDR4 LRDIMM, SSD Raid, W10-Ent, Quadro K420
Xeon E3-1505Mv5, 32GB DDR4, NVME, W10-Pro, Quadro M1000M
i7-960, 12GB DDR3, SSD, W10-Pro, GTX1080Ti
i9-10850K, 64GB DDR4, NVME, W11-Pro, RTX3070
(Green/Bold = Active)
-
- Posts: 146
- Joined: Sun Jul 30, 2017 8:40 pm
Re: CPU vs GPU point rewards
Anyway, I would consider CPU folding with fast CPUs only.
CPU folding with slow 4 core CPUs is not worth it, imho.
-
- Posts: 1996
- Joined: Sun Mar 22, 2020 5:52 pm
- Hardware configuration: 1: 2x Xeon E5-2697v3@2.60GHz, 512GB DDR4 LRDIMM, SSD Raid, Win10 Ent 20H2, Quadro K420 1GB, FAH 7.6.21
2: Xeon E3-1505Mv5@2.80GHz, 32GB DDR4, NVME, Win10 Pro 20H2, Quadro M1000M 2GB, FAH 7.6.21 (actually have two of these)
3: i7-960@3.20GHz, 12GB DDR3, SSD, Win10 Pro 20H2, GTX 750Ti 2GB, GTX 1080Ti 11GB, FAH 7.6.21 - Location: UK
Re: CPU vs GPU point rewards
(indulging my own metaphor because I have little better to do at the moment !!)
"fast" can have various meanings … My sports car has a small high-rev engine and does get me places fast but with little load - it might take 6/7 trips to carry the same amount of stuff a set distance as my SUV, albeit far more fun - the SUV will get some jobs done quicker than my much faster sports car
What I am really saying is my relatively slow Xeons in my server may not be the fastest on the block but running two slots (24 and 30) does let me shift a fair few WUs in a relatively short time (usually above 250k ppd) - maybe not much by GPU standards but enough to help I reckon?
… and that is what it comes down to I believe … I also wouldn't use a slow 4 core if I had one, because I have other kit I can donate some cycles with that is much more useful … but if all I had was a slow 4 core (as long as it could meet the expiry times for WUs) I'd want to be part of FAH and make a positive contribution (if only a small one)
2x Xeon E5-2697v3, 512GB DDR4 LRDIMM, SSD Raid, W10-Ent, Quadro K420
Xeon E3-1505Mv5, 32GB DDR4, NVME, W10-Pro, Quadro M1000M
i7-960, 12GB DDR3, SSD, W10-Pro, GTX1080Ti
i9-10850K, 64GB DDR4, NVME, W11-Pro, RTX3070
(Green/Bold = Active)
Re: CPU vs GPU point rewards
MartinPanda wrote: For example, I have now been assigned 2 projects: a CPU project with base 243 points and a GPU project with base 9405 points.
So basically, it would seem that the CPU projects are less important... If they want faster results, they may want to run these projects on GPUs instead. And if the calculations are so special that they cannot be done on the faster GPUs, then they should compensate with more base points.

The base points do not rank the importance of one project relative to another. They're intended to reflect the amount of simulation accomplished in each work unit. Performance per watt is not part of the points, and as you've noticed, P/W favours GPUs.
I posted a breakdown comparing CPU and GPU FLOPS in a similar thread.
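For a rough sense of scale, peak single-precision throughput can be estimated as cores × clock × ops per cycle. A sketch with assumed, typical figures (not the actual breakdown from that thread):
[code]
# Peak single-precision GFLOPS, estimated as
#   cores * clock (GHz) * FP ops per core per cycle.
# Assumed, typical figures for a quad-core AVX CPU and a midrange GPU.

cpu_gflops = 4 * 3.5 * (8 * 2)  # 4 cores, 3.5 GHz, 8-lane FMA -> 224
gpu_gflops = 2560 * 1.7 * 2     # 2560 shaders, 1.7 GHz, FMA -> 8704

print(f"CPU ~{cpu_gflops:.0f} GFLOPS")
print(f"GPU ~{gpu_gflops:.0f} GFLOPS (~{gpu_gflops / cpu_gflops:.0f}x)")  # ~39x
[/code]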
-
- Posts: 2522
- Joined: Mon Feb 16, 2009 4:12 am
- Location: Greenwood MS USA
Re: CPU vs GPU point rewards
MartinPanda wrote: So then the tasks that can be processed by the GPU should be processed by the GPU, and the tasks that can be processed only by the CPU should yield more points

The amount of science it yields forms the Base Points of a work unit. The aptly named Quick Return Bonus multiplies that by how quickly you did that science.
Since the GPUs do a lot of work very quickly, they get the most points.
(To some degree, you are disagreeing with the biochemists' goals, while I am just explaining how the software works. Biochemists resemble 2-year-olds in this regard: they not only want their toys, they want them NOW.)
Tsar of all the Rushers
I tried to remain childlike, all I achieved was childish.
A friend to those who want no friends
-
- Posts: 8
- Joined: Sat Mar 14, 2020 10:01 pm
Re: CPU vs GPU point rewards
JimboPalmer wrote: The amount of science it yields forms the Base Points of a work unit.

Well, maybe that's it - how do you measure "the amount of science"? If you need double precision instead of single precision, is it double the science? Or the squared amount of science? Or something else?

Neil-B wrote: The way I look at it is … I drive both a sports car and an SUV …
To take Neil's analogy with cars: sometimes we (people, companies) need to move some stuff from A to B. Lots of stuff. You can do it in your average sedan, which is OK for everyday use but not good for moving large amounts of stuff (let's say, trying to run FAH on a laptop or smartphone).
Of course it's much better to move the goods in trucks, which can carry much more load at the same time (GPUs).
But sometimes, when the things you want to move are really complicated and you can't just put them in trucks (because they are too heavy, too wide, or in FAH's case they need double precision), you use something which... well, it's slower in number of things moved per unit of time, but they charge you adequately for it and for all the complications.
Like this: https://www.youtube.com/watch?v=3PH0idBnJj4
So if we fold on the CPU only the things that cannot be done on a GPU, then we should get "paid" adequately as well.
But OK, if you don't recommend folding on a 4-core CPU, and you all have nice server processors at home (I wonder why), then I'll just donate my GPU time.
My CPU is waiting for assignments half of the time anyway, so I presume there are enough people with good CPUs to do the job faster. Correct conclusion?
-
- Posts: 1996
- Joined: Sun Mar 22, 2020 5:52 pm
- Hardware configuration: 1: 2x Xeon E5-2697v3@2.60GHz, 512GB DDR4 LRDIMM, SSD Raid, Win10 Ent 20H2, Quadro K420 1GB, FAH 7.6.21
2: Xeon E3-1505Mv5@2.80GHz, 32GB DDR4, NVME, Win10 Pro 20H2, Quadro M1000M 2GB, FAH 7.6.21 (actually have two of these)
3: i7-960@3.20GHz, 12GB DDR3, SSD, Win10 Pro 20H2, GTX 750Ti 2GB, GTX 1080Ti 11GB, FAH 7.6.21 - Location: UK
Re: CPU vs GPU point rewards
Please note I did write:
… but if all I had was a slow 4 core (as long as it could meet the expiry times for WUs) I'd want to be part of FAH and make a positive contribution (if only a small one) …
I am incredibly fortunate to have a fairly decent server I can dedicate at the moment to this effort - because of that I get to not use my laptop to fold that often (it does occasionally) … In the past I folded with a variety of bits of tin most of which pretty much only scraped in under the timeout - It was all I had.
So no, I personally didn't recommend against using a 4-core CPU - I simply said I wouldn't at this time and then explained why.
2x Xeon E5-2697v3, 512GB DDR4 LRDIMM, SSD Raid, W10-Ent, Quadro K420
Xeon E3-1505Mv5, 32GB DDR4, NVME, W10-Pro, Quadro M1000M
i7-960, 12GB DDR3, SSD, W10-Pro, GTX1080Ti
i9-10850K, 64GB DDR4, NVME, W11-Pro, RTX3070
(Green/Bold = Active)