CPU vs GPU point rewards
Posted: Sat Mar 14, 2020 10:33 pm
Hello,
I am new here, so this may seem like a silly question that has already been answered...
I tried searching the forum, but it says the keywords "points, cpu, gpu" are too common.
So anyway, why are the points for CPU tasks much lower than points for GPU tasks?
Today I installed the client and started working on some tasks (although I was a bit dissatisfied: there were urgent messages saying you need new donors for COVID research, but I was not assigned any COVID task, and my GPU was idling half the day...)
And I noticed that for a task that ran ~6 h on my CPU I got 733 points, while for a task that ran ~3 h on my GPU I got 49k points.
You may argue that GPU computing is better and does more work in the same time, so OK, I would accept that, but...
But the CPU task consumed ~40 W for 6 hours (~240 Wh), while the GPU task consumed ~100 W for 3 hours (~300 Wh). So the GPU task used ~25% more energy but yielded ~6,600% more points, which means that from a points-per-money (electricity) perspective, the GPU task was vastly superior.
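To make the arithmetic explicit, here is a quick back-of-the-envelope calculation in Python using the numbers above (these are my own rough power readings, so treat the exact figures as approximate):

[code]
# Rough comparison of the two tasks above (my own measurements).
cpu_hours, cpu_watts, cpu_points = 6, 40, 733
gpu_hours, gpu_watts, gpu_points = 3, 100, 49_000

cpu_energy_wh = cpu_watts * cpu_hours  # ~240 Wh
gpu_energy_wh = gpu_watts * gpu_hours  # ~300 Wh

print(f"CPU: {cpu_points / cpu_energy_wh:.1f} points/Wh")  # ~3.1
print(f"GPU: {gpu_points / gpu_energy_wh:.1f} points/Wh")  # ~163.3
print(f"GPU extra energy: {gpu_energy_wh / cpu_energy_wh - 1:.0%}")  # 25%
print(f"GPU extra points: {gpu_points / cpu_points - 1:.0%}")  # ~6585%
[/code]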
I mean, what is the point of the points? If I treated this purely as a game, I would cancel the CPU task, because it is not profitable.
I saw in the FAQ on your site that GPU tasks are benchmarked on the CPU as well. That does not make much sense to me, since the architectures are different: the CPU may take much, much longer to compute the same work, but the GPU computes it faster WITHOUT using that much more energy.
Or, to look at it from the other direction - if GPU tasks can be run on a CPU for your benchmarks, why can't we run the CPU tasks on our GPUs?
My GPU has been idling half the day, with the client saying the server has nothing for it to do, while my CPU was consuming power...
What is the difference between those tasks? Are there special operations that a GPU cannot handle? Wouldn't it make sense, then, to award CPU computing more points?
Thanks for the answer - I am very interested in the computer-science side of this, as I try to understand the process.
Martin