Rationale for GPU WUs granting 20x more points than CPUs?
Posted: Mon Apr 27, 2020 3:47 pm
by BP2020
I'm reading the Rules & Policies section of the folding@home website, in particular the Best Practices ("Folding@home (FAH) is a major scientific endeavor, but is also a kind of contest for some donors to see who can donate the most points.") regarding The Project and Work Units (WUs). Maybe this is hardware dependent, but in my case the rewards for GPU WUs seem to be about 20-fold what they are for CPU WUs, i.e. usually 50k for a 2-3 hr GPU WU vs 3k for a 2-5 hr CPU WU. So in my case, CPU WUs not only grant fewer points, they also take more time to complete. At this point in time, the ratio of GPUs vs CPUs donated to the project is about 1 to 4 (see OS).
At the same time there is sometimes a shortage of WUs for GPUs. I also saw at least one donor donating only GPU time, which yields lots of points (and some sense of entitlement, if not necessarily more completed WUs, which is what this is really about: WUs for medical research projects and science targeting illnesses that kill or maim people, such as the COVID-19 virus). AFAIK these points are not redeemable for a t-shirt, a discount on a game, or a gift card enabling disrespectful behavior/rants/trolling in the forum; and donations are more meaningful collectively than individually, really.
In that context, especially given the 1 to 4 ratio in which the "1" is still close to 750k GPUs (i.e. a lot), possibly more than 6 months ago, etc.: what is the rationale again for CPU WUs being "worth" so much less than GPU WUs? Is it just my old CPU hardware (i5-2500K, 3.3 GHz) that skews my perception, and does that rationale still hold today?
Re: Rationale for GPU WUs granting 20x more points than CPUs
Posted: Mon Apr 27, 2020 3:52 pm
by Neil-B
viewtopic.php?f=72&t=33496&hilit=qrb+difference has some discussion of this ... it comes down to the simple fact that GPUs process science that much quicker than CPUs.
The fastest CPUs atm can complete WUs at high 100k's of points a day, which competes with the middle- to lower-end GPUs but not with the higher-end GPUs (in the low millions) ... and on "points per watt" GPUs tend to smash CPUs out of the park ... it is for this reason that the "mining community" uses GPU farms, not CPU farms.
Re: Rationale for GPU WUs granting 20x more points than CPUs
Posted: Mon Apr 27, 2020 4:05 pm
by ajm
GPUs earn more points, but they are less flexible. Folding on a GPU can affect your user experience (if you use that GPU for your monitor), because GPU folding is an all-or-nothing kind of operation. CPU jobs, on the other hand, are unobtrusive: they will never prevent you from using your hardware when you need it (FAH runs at low priority).
Re: Rationale for GPU WUs granting 20x more points than CPUs
Posted: Mon Apr 27, 2020 4:09 pm
by BP2020
@Neil-B Thank you, that discussion was very interesting and I understand better now. Imho some of that material should be made "sticky" in the New Donors thread, such as a post about GPUs.txt and this one, for instance. Search doesn't allow for too-common/short words...
@ajm Unobtrusiveness is a good point too, had forgotten about that, thanks!
Re: Rationale for GPU WUs granting 20x more points than CPUs
Posted: Mon Apr 27, 2020 5:04 pm
by AlanTheBeast
Personally I've just set up hardware to do C-19 jobs as much as possible, and I don't really care much about points. Yes, the combined GPU + CPU on my son's 2018 PC (6 cores x 2 threads) is shellacking my 2012 i7 iMac (4 cores x 2 threads, no compatible GPU) by a wide margin ... but that's beside the point; my interest is in hoping we might find a vital clue to protecting us against SARS-CoV-2. In a nutshell: GPUs should get more points per hour, because they are doing far more.
Re: Rationale for GPU WUs granting 20x more points than CPUs
Posted: Mon Apr 27, 2020 5:32 pm
by JimboPalmer
"Welcome to Whose Line is it Anyway, the show where everything's made up and the points don't matter"
I use points, they are a quick way to tell if all my PCs are running.
I use points, I can judge whether this strategy yields more points than that strategy. Points can approximate the value of the science I am doing.
I use points, I have for a month been judging how swamped the servers are/were, by the Points Per Day I can do.
However, in any big picture, the points don't matter. If you start thinking they do matter, you drift toward cheating, and no one wins.
CPUs and GPUs: I have an Intel i5 with 4 cores and 8-way single-precision SIMD running at 3 GHz, so it does 32 x 3 = 96 JimboFLOPS (a made-up measure of Floating Point Operations Per Second).
I also have an Nvidia GTX 1060 with 1280 cores running at 1.5 GHz, or 1920 JimboFLOPS, so my graphics card should get about 20 times what my CPU does; but the Quick Return Bonus favors quick, so it does even better.
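JimboPalmer's back-of-the-envelope arithmetic can be sketched in a few lines of Python. "JimboFLOPS" is the post's own made-up unit, not measured throughput, and the GPU figure here uses the GTX 1060's published count of 1280 CUDA cores, so it may differ slightly from the numbers in the post:

```python
def jimboflops(cores: int, simd_width: int, ghz: float) -> float:
    """Made-up throughput unit: cores x SIMD lanes x clock (GHz)."""
    return cores * simd_width * ghz

# i5: 4 cores, 8-wide single-precision SIMD, 3 GHz
cpu = jimboflops(cores=4, simd_width=8, ghz=3.0)     # 96 JimboFLOPS
# GTX 1060: 1280 CUDA cores (each roughly one lane), 1.5 GHz
gpu = jimboflops(cores=1280, simd_width=1, ghz=1.5)  # 1920 JimboFLOPS

print(cpu, gpu, gpu / cpu)  # roughly a 20x gap
```

The comparison deliberately ignores memory bandwidth, clock boosting, and per-core efficiency; it is only meant to show where the ~20x order of magnitude comes from.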
The GPU really does brute-force past the CPU's ability. The CPU does certain subtle things better, so it still has value.
Re: Rationale for GPU WUs granting 20x more points than CPUs
Posted: Mon Apr 27, 2020 8:56 pm
by paulmd199
I happen to have one machine that does GPU only, because it's a quad core and FAH needs one CPU core per GPU. That, plus the machine's remaining tasks, makes the processor run at 90%, and folding on the remaining cores would invite a bottleneck.
From what I understand about how base points are calculated, it is on an equal-points-for-equal-work basis. WUs, both CPU and GPU, are benchmarked on the CPU of one of FAH's own machines, a Core i5, if the documentation is current.
EDIT:
From the FAQ
https://foldingathome.org/support/faq/points/
"Note that GPU projects are now being benchmarked on the same machine, but using that machine’s CPU. By using the same hardware, we want to preserve our goal of “equal pay for equal work”. Our GPU methods have advanced to the point such that, with GPU FAHCore 17, we can run any computation that we can do on the CPU on the GPU. Therefore we’ve unified the benchmarking scheme so that both GPU and CPU projects use the same “yardstick”, which is our i5 benchmark CPU"
Re: Rationale for GPU WUs granting 20x more points than CPUs
Posted: Tue Apr 28, 2020 1:52 am
by MeeLee
It's a problem that's been brought up multiple times already, and FAH is sticking with the current system.
The benchmark procedure made a lot of sense back in the days of CPU folding.
You get points for a job, and you get bonus points for finishing the job faster.
The issue is, a modern CPU with 6-8 cores is no match for a GPU with 2000 to 4000 shaders (kind of like half-cores, in a way).
That kind of processing power allows WUs to be returned so quickly that the majority of the points now earned are no longer base points but QRB (Quick Return Bonus) points.
While some find this unfair, and it is unfair, it is the rewards system that FAH chose to apply and stick with.
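The base-plus-QRB scheme MeeLee describes can be sketched from the formula in the FAH points FAQ: credit is the base points multiplied by a bonus of sqrt(k x deadline / elapsed), floored at 1x, where k is a per-project bonus factor. The k value and WU numbers below are purely hypothetical, chosen only to show how fast returns dominate the payout:

```python
import math

def wu_points(base: float, k: float, deadline_days: float,
              elapsed_days: float) -> float:
    """Base credit scaled by the Quick Return Bonus (never below 1x)."""
    bonus = math.sqrt(k * deadline_days / elapsed_days)
    return base * max(1.0, bonus)

# Hypothetical WU: 1000 base points, k = 0.75, 10-day deadline.
# A GPU returning it in 0.1 days vs a CPU taking 2 days:
print(wu_points(base=1000, k=0.75, deadline_days=10, elapsed_days=0.1))
print(wu_points(base=1000, k=0.75, deadline_days=10, elapsed_days=2.0))
```

With these numbers the fast return earns roughly 8660 points against roughly 1936 for the slow one, even though both did identical work: the square root in the bonus is why most points on fast hardware are QRB rather than base credit.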
Re: Rationale for GPU WUs granting 20x more points than CPUs
Posted: Tue Apr 28, 2020 4:26 am
by BP2020
Thanks all, I'm satisfied that it's been discussed and that there are opinions and a rationale; the topic is really interesting imho. I don't remember the last time I bought computer parts, might have been a PIII, hahah; my rig is leftovers from a gamer friend, so his donation makes mine possible. The fact that with every WU I can contribute to some researcher "hitting the jackpot" gives me great satisfaction irrespective of points. But hey, I like points too!