Points CPU vs GPU
Moderators: Site Moderators, FAHC Science Team
-
- Site Admin
- Posts: 7939
- Joined: Tue Apr 21, 2009 4:41 pm
- Hardware configuration: Mac Pro 2.8 quad 12 GB smp4
MacBook Pro 2.9 i7 8 GB smp2
- Location: W. MA
Re: Points CPU vs GPU
One other metric you can look at: compare the total number of "steps" listed in the log for CPU and GPU WUs. A typical CPU WU has 250k or 500k steps; a GPU WU might have 10x that. Usually those steps cover the same length of time, 2 fs if I recall the scale correctly, though there has been some work to increase the length of the time step to twice that for some GPU projects.
Edit: corrected length of time for a step, was 2 ps corrected to 2 femtoseconds.
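For anyone who wants to turn those step counts into simulated time, here is a minimal sketch of the arithmetic (Python; the 2 fs time step and the 10x ratio are the figures quoted above, the exact WU sizes are just illustrative):

```python
# Rough illustration of how step count and time-step length translate into
# simulated time. The 2 fs step comes from the post above; the step counts
# are typical examples, not values from any specific project.
FEMTOSECONDS_PER_NANOSECOND = 1_000_000

def simulated_nanoseconds(steps: int, timestep_fs: float = 2.0) -> float:
    """Total simulated time for a work unit, in nanoseconds."""
    return steps * timestep_fs / FEMTOSECONDS_PER_NANOSECOND

# A 500k-step CPU WU vs. a 5M-step GPU WU (10x the steps, same 2 fs step):
print(simulated_nanoseconds(500_000))    # 1.0 ns
print(simulated_nanoseconds(5_000_000))  # 10.0 ns
```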
iMac 2.8 i7 12 GB smp8, Mac Pro 2.8 quad 12 GB smp6
MacBook Pro 2.9 i7 8 GB smp3
-
- Posts: 52
- Joined: Sat Mar 28, 2020 1:22 am
Re: Points CPU vs GPU
I am aware a CPU does fewer FLOPS. However, it does work that a GPU literally cannot do.
JimboPalmer wrote: They still do not do as much science, measured as FLOPS, as a GPU.
Endgame124 wrote: But GPUs have a throughput of 0 on CPU-only WUs, so CPUs have infinitely higher throughput on those WUs.
Neil-B wrote: Because Points do not reward importance in any way … Points are awarded as a "measure" of scientific throughput with a loading for Quick Return … GPUs have a higher throughput rate - simples.
https://en.wikipedia.org/wiki/FLOPS
Folding at home has defined a "value" for work units, and that value is points. Generally speaking, a CPU of a certain generation will get one tenth the points of a GPU of the same generation. This is F@H telling us that a GPU is 10x more valuable to them than a CPU, and that the best thing you can do for the project is to have your CPUs feeding video cards.
Now, I don't think it is necessarily correct to say that a CPU is 10x less valuable than a GPU, since over half the work appears to be CPU-only, but until the points are changed it's hard to argue with the story the points tell.
-
- Posts: 946
- Joined: Sun Dec 16, 2007 6:22 pm
- Hardware configuration: 7950x3D, 5950x, 5800x3D, 3900x
7900xtx, Radeon 7, 5700xt, 6900xt, RX 550 640SP
- Location: London
- Contact:
Re: Points CPU vs GPU
Points are awarded to value a given piece of hardware on a given simulation.
Points are not awarded for the average value of that hardware across the whole FAH project.
If we started valuing hardware on how valuable it is in general, you would end up with the ATI 1800 series as the most valuable hardware ever, since that kickstarted the whole GPU folding effort. So shall we go back and recalculate all the points earned by users with that hardware?
This is an extremely weird discussion, and I cannot believe it is happening.
Points show hardware value per given project.
Some CPUs are much faster core for core than others.
Some GPUs are 10x faster than CPUs.
Some GPUs are only 2x faster than the fastest consumer CPU.
Some GPUs will not fold at all, some will only fold certain projects, some will not fold projects available for CPUs.
Some CPUs will not fold at all, some will only fold certain projects, some will not fold projects available for GPUs.
Some people fold for the greater good of humanity or science; some people, I guess, worry that the useless point system is somehow unfair to a CPU.
FAH Omega tester
-
- Posts: 2522
- Joined: Mon Feb 16, 2009 4:12 am
- Location: Greenwood MS USA
Re: Points CPU vs GPU
I know it used to. I am unaware of what Core_a7 does that Core_22 can't do. Please explain.
Endgame124 wrote: I am aware a CPU does fewer FLOPS. However, it does work that a GPU literally cannot do.
Tsar of all the Rushers
I tried to remain childlike, all I achieved was childish.
A friend to those who want no friends
-
- Posts: 946
- Joined: Sun Dec 16, 2007 6:22 pm
- Hardware configuration: 7950x3D, 5950x, 5800x3D, 3900x
7900xtx, Radeon 7, 5700xt, 6900xt, RX 550 640SP
- Location: London
- Contact:
Re: Points CPU vs GPU
They can't magically be converted into one another.
JimboPalmer wrote: I know it used to. I am unaware of what Core_a7 does that Core_22 can't do. Please explain.
Endgame124 wrote: I am aware a CPU does fewer FLOPS. However, it does work that a GPU literally cannot do.
FAH Omega tester
-
- Posts: 42
- Joined: Sat Apr 18, 2020 12:48 am
- Hardware configuration: AMD 5700x, Asus Prime X570-Pro, EVGA 3080 12gb, G.SKILL Aegis 16GB DDR4 3200, Windows 10
Re: Points CPU vs GPU
Sorry I ever asked the question; it seemed like a logical question when looking at the return on investment of a CPU and a GPU for a system I was building for folding. Points might be useless, but they are the data that is given.
muziqaz wrote: Points are awarded to value a given piece of hardware on a given simulation.
Last edited by skydivingcatfan on Wed May 06, 2020 8:09 pm, edited 1 time in total.
-
- Posts: 52
- Joined: Sat Mar 28, 2020 1:22 am
Re: Points CPU vs GPU
What GPU can load A7 again?
JimboPalmer wrote: I know it used to. I am unaware of what Core_a7 does that Core_22 can't do. Please explain.
Endgame124 wrote: I am aware a CPU does fewer FLOPS. However, it does work that a GPU literally cannot do.
-
- Posts: 946
- Joined: Sun Dec 16, 2007 6:22 pm
- Hardware configuration: 7950x3D, 5950x, 5800x3D, 3900x
7900xtx, Radeon 7, 5700xt, 6900xt, RX 550 640SP
- Location: London
- Contact:
Re: Points CPU vs GPU
So I am trying to answer as logically as possible.
skydivingcatfan wrote: Sorry I ever asked the question; it seemed like a logical question when looking at the return on investment of a CPU and a GPU for a system I was building for folding. Points might be useless, but they are the data that is given.
muziqaz wrote: Points are awarded to value a given piece of hardware on a given simulation.
A logical question would be: why is my CPU getting fewer points than another person's CPU on p12345? Or why is my GPU getting fewer points than another person's GPU?
That way you would be comparing oranges with oranges.
Instead you are suggesting that GPU points be equalised with CPU points just because you are not getting any GPU work. You have been told that the reason you are not getting GPU work, or are getting more CPU work, is that the majority of researchers are used to working with Gromacs. With the latest versions of core_22 it can do almost everything that fahcore_a7 can do, and it can do it a lot faster; not just a little bit faster, but a lot faster. The faster a given piece of hardware finishes the job, and the bigger the WU, the more points it will earn you.
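To make that relationship concrete, here is a rough sketch of the quick-return-bonus style credit calculation. It follows the commonly published shape of the QRB (base credit scaled by the square root of k times the deadline over the elapsed time, never below base); the base points, k factor and deadline below are made-up example numbers, not values from any real project.

```python
import math

def wu_points(base_points: float, k: float, deadline_days: float,
              elapsed_days: float) -> float:
    """Sketch of a QRB-style credit calculation: base credit scaled by
    sqrt(k * deadline / elapsed), never dropping below the base credit.
    base_points, k and deadline_days are set per project; elapsed_days is
    how long the client actually took to return the WU."""
    bonus = math.sqrt(k * deadline_days / elapsed_days)
    return base_points * max(1.0, bonus)

# Hypothetical project: base 10,000 points, k = 0.75, 3-day deadline.
print(round(wu_points(10_000, 0.75, 3.0, 0.25)))  # fast, GPU-style return -> 30000
print(round(wu_points(10_000, 0.75, 3.0, 2.0)))   # slow, CPU-style return -> 10607
```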
FAH Omega tester
-
- Posts: 2522
- Joined: Mon Feb 16, 2009 4:12 am
- Location: Greenwood MS USA
Re: Points CPU vs GPU
I am sorry, you misread my question. Let's try again: what science can Core_a7 do that Core_22 can't?
Endgame124 wrote: What GPU can load A7 again?
JimboPalmer wrote: I know it used to. I am unaware of what Core_a7 does that Core_22 can't do. Please explain.
Endgame124 wrote: I am aware a CPU does fewer FLOPS. However, it does work that a GPU literally cannot do.
Tsar of all the Rushers
I tried to remain childlike, all I achieved was childish.
A friend to those who want no friends
-
- Posts: 52
- Joined: Sat Mar 28, 2020 1:22 am
Re: Points CPU vs GPU
It doesn't matter what Core_22 can do when a7 work is produced at an equal or higher rate.
JimboPalmer wrote: I am sorry, you misread my question. Let's try again: what science can Core_a7 do that Core_22 can't?
Endgame124 wrote: What GPU can load A7 again?
-
- Posts: 2522
- Joined: Mon Feb 16, 2009 4:12 am
- Location: Greenwood MS USA
Re: Points CPU vs GPU
If you have a 128-thread Zen 2 Threadripper running Linux, then Core_a7 is able to compete with a GPU running Core_22 in FLOPS, and so gets similar points.
Endgame124 wrote: It doesn't matter what Core_22 can do when a7 work is produced at an equal or higher rate.
JimboPalmer wrote: What science can Core_a7 do that Core_22 can't?
https://www.amd.com/en/products/cpu/amd ... pper-3990x
So you have something to look forward to! I am retired, so will never be paid to use one. Rats.
Tsar of all the Rushers
I tried to remain childlike, all I achieved was childish.
A friend to those who want no friends
-
- Posts: 42
- Joined: Sat Apr 18, 2020 12:48 am
- Hardware configuration: AMD 5700x, Asus Prime X570-Pro, EVGA 3080 12gb, G.SKILL Aegis 16GB DDR4 3200, Windows 10
Re: Points CPU vs GPU
I did not ask the question because I am not getting any GPU work units; I am not sure where that came from. This came from looking at what the best components are to put into a folding system at my budget. My question is very logical, and it is what I do at work all the time: what is the best return for the capital available?
muziqaz wrote: So I am trying to answer as logically as possible.
I never figured that posting my observation, that folding on my CPU costs twice as much as folding on my GPU, would get people upset.
-
- Posts: 2522
- Joined: Mon Feb 16, 2009 4:12 am
- Location: Greenwood MS USA
Re: Points CPU vs GPU
I do not feel I am upset; I merely question why it is F@H's fault that you spent money the way you did. F@H assigns points based on the work it gets done.
skydivingcatfan wrote: I never figured that posting my observation, that folding on my CPU costs twice as much as folding on my GPU, would get people upset.
That is completely orthogonal to what the volunteers are spending. If you look around, you can find threads on bang for buck.
(At one time, one of my clients funded 55 PCs to fold as 'me'. For my community theater I need 3 laptops to sell tickets, and they fold the other 335 days of the year; similarly, I judge BBQ about 6 weekends a year, and the rest of the time those machines fold. My dedicated folders are 3 desktops, all with i3 CPUs and low-end Nvidia cards: GTX1050ti, GTX1060, and GTX1650. My CPUs do 50k PPD, while my GPUs do 900k PPD.) That keeps my computers busy; I try to help online to keep myself busy.
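As a toy illustration of the "bang for buck" comparison, the sketch below uses the aggregate PPD figures quoted just above; the prices are purely hypothetical, so treat it as the shape of the method rather than real numbers.

```python
# Points-per-day per dollar, using the aggregate PPD figures quoted above
# (50k PPD across the CPUs, 900k PPD across the GPUs). The prices are
# purely hypothetical -- the point is the shape of the comparison,
# not the specific numbers.
setups = {
    "three i3 CPUs":      {"ppd": 50_000,  "price_usd": 300},
    "three low-end GPUs": {"ppd": 900_000, "price_usd": 450},
}

for name, s in setups.items():
    print(f"{name}: {s['ppd'] / s['price_usd']:,.0f} PPD per dollar")
# three i3 CPUs: 167 PPD per dollar
# three low-end GPUs: 2,000 PPD per dollar
```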
Tsar of all the Rushers
I tried to remain childlike, all I achieved was childish.
A friend to those who want no friends
-
- Posts: 42
- Joined: Sat Apr 18, 2020 12:48 am
- Hardware configuration: AMD 5700x, Asus Prime X570-Pro, EVGA 3080 12gb, G.SKILL Aegis 16GB DDR4 3200, Windows 10
Re: Points CPU vs GPU
I never blamed F@H for anything; I just made an observation and asked some questions.
JimboPalmer wrote: I do not feel I am upset; I merely question why it is F@H's fault that you spent money the way you did. F@H assigns points based on the work it gets done.
-
- Posts: 1996
- Joined: Sun Mar 22, 2020 5:52 pm
- Hardware configuration: 1: 2x Xeon E5-2697v3@2.60GHz, 512GB DDR4 LRDIMM, SSD Raid, Win10 Ent 20H2, Quadro K420 1GB, FAH 7.6.21
2: Xeon E3-1505Mv5@2.80GHz, 32GB DDR4, NVME, Win10 Pro 20H2, Quadro M1000M 2GB, FAH 7.6.21 (actually have two of these)
3: i7-960@3.20GHz, 12GB DDR3, SSD, Win10 Pro 20H2, GTX 750Ti 2GB, GTX 1080Ti 11GB, FAH 7.6.21
- Location: UK
Re: Points CPU vs GPU
Folding awards points for throughput of science, with a QRB for speed. It in no way defines this as value/importance; it is simply a metric, and as such aligns loosely with FLOPS with a modifier for speed ... so since a CPU does fewer FLOPS, it gets fewer points.
Endgame124 wrote: I am aware a CPU does fewer FLOPS. However, it does work that a GPU literally cannot do.
Folding at home has defined a "value" for work units, and that value is points. Generally speaking, a CPU of a certain generation will get one tenth the points of a GPU of the same generation. This is F@H telling us that a GPU is 10x more valuable to them than a CPU, and that the best thing you can do for the project is to have your CPUs feeding video cards.
Now, I don't think it is necessarily correct to say that a CPU is 10x less valuable than a GPU, since over half the work appears to be CPU-only, but until the points are changed it's hard to argue with the story the points tell.
FAH has not defined a "value"; it has produced a metric, which is a very different thing ... It is not saying (and neither have I) that CPU WUs are less valuable; that is something people who try to ascribe value/importance to points seem to want to do. Points have no value; they are simply a measure of how much generic processing has been done.
Personally, I really wish the competitive "my points are higher than your points" mentality that people seem desperate to promote didn't exist, as it is in many ways an unhealthy approach to folding (and based on an incorrect understanding of what points are).
If people have to then they could perhaps think of Points as fruit:
- GPU WUs could be apples from vast orchards with automatic tree shakers that produce many cheap apples
- CPU WUs could be dragon fruit from slower growing/less fruiting plants that have a more expensive more complex fruit
- Points are like your "five/ten a day" - it is simply a count of how many items of fruit you eat - not how expensive or how rare or how long they take to grow
... and as with Points, I am sure there will be people who argue that a Dragon Fruit should count as more than one of one's "five/ten a day" because it is rarer, harder to grow, and more expensive ... but in truth it is just one item of fruit!
So, I'll say it again, indeed I'll shout it out once more in case it helps anyone understand ... "POINTS HAVE NO VALUE AND ARE NOT A MEASURE OF VALUE OR IMPORTANCE" ...
2x Xeon E5-2697v3, 512GB DDR4 LRDIMM, SSD Raid, W10-Ent, Quadro K420
Xeon E3-1505Mv5, 32GB DDR4, NVME, W10-Pro, Quadro M1000M
i7-960, 12GB DDR3, SSD, W10-Pro, GTX1080Ti
i9-10850K, 64GB DDR4, NVME, W11-Pro, RTX3070
(Green/Bold = Active)