What hardware provides best ppd per Watt?
I would guess the best ppd per Watt ratio would be achieved by second-generation Maxwells, but has anyone got any hard figures?
What about in the CPU arena? What's best, and is anything competitive with GPUs?
- Posts: 1164
- Joined: Wed Apr 01, 2009 9:22 pm
- Hardware configuration: Asus Z8NA-D6C, 2× X5670 @ 3.2 GHz, 12 GB RAM, GTX 980 Ti, AX650 PSU, Win 10 (daily use)
Asus Z87 WS, Xeon E3-1230L v3, 8 GB RAM, KFA GTX 1080, EVGA 750 Ti, AX760 PSU, Mint 18.2 OS
Not currently folding
Asus Z9PE-D8 WS, 2× E5-2665 @ 2.3 GHz, 16 GB 1.35 V RAM, Ubuntu (fold only)
Asus Z9PA, 2× Ivy 12-core, 16 GB RAM, H folding appliance (fold only) - Location: Jersey, Channel Islands
Re: What hardware provides best ppd per Watt?
Can't help with GPUs, but a pair of E5 v2 12-core Xeons running at 2.4 GHz gives me 350-500k PPD at 280 W from the wall. That's 1250-1785 PPD/W depending on the WU. Obviously this will change when BA ends, but for now I'm happy.
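For anyone wanting to reproduce figures like these, the arithmetic is just PPD divided by wall watts. A minimal sketch (the numbers are the ones quoted above; the helper function is mine, not anything from the client):

```python
def ppd_per_watt(ppd: float, wall_watts: float) -> float:
    """Folding efficiency: points per day divided by power drawn at the wall."""
    return ppd / wall_watts

# Dual E5 v2 Xeon figures quoted above: 350-500k PPD at 280 W from the wall.
print(f"low WU:  {ppd_per_watt(350_000, 280):.0f} PPD/W")  # 1250
print(f"high WU: {ppd_per_watt(500_000, 280):.0f} PPD/W")  # ~1786 (the post rounds to 1785)
```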
Re: What hardware provides best ppd per Watt?
A new generation of GPUs comes out from time to time. Many of those generations involve a die-shrink, as manufacturing shifts to a process with smaller, more closely packed transistors. As a general rule, that brings a major reduction in power and/or an increase in PPD, so in terms of PPD/W, choose the newest technology available. Be careful, though: not every new generation of GPUs involves new manufacturing technology.
Of course, the latest chips also carry a higher purchase price while prices of older technology are cut, so the total cost of operation involves both the initial cost and the long-term cost of power over the life of the hardware.
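To make that trade-off concrete, here is a hedged sketch of a total-cost comparison. Every number in it (prices, wattages, runtime, electricity rate) is a made-up assumption for illustration, not a figure from this thread:

```python
def total_cost(purchase_price: float, watts: float, hours: float,
               price_per_kwh: float) -> float:
    """Purchase price plus the cost of electricity over the hardware's life."""
    return purchase_price + (watts / 1000) * hours * price_per_kwh

# Hypothetical: a newer, efficient card vs. a cheaper, hungrier older card,
# both folding 24/7 for three years at $0.15/kWh.
hours = 3 * 365 * 24
print(f"new card: ${total_cost(300, 120, hours, 0.15):,.0f}")  # ~$773
print(f"old card: ${total_cost(150, 250, hours, 0.15):,.0f}")  # ~$1,136
```

On those invented numbers the cheaper card ends up costing more overall, which is the point about long-term power costs.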
In all the examples I've seen, PPD/W for GPUs is better than PPD/W for CPUs, but since you have to have a CPU to drive the GPU anyway, you might as well use it for a (small) increase in both power and PPD.
Personally, I'm very happy with my GTX 750 Ti (Maxwell Gen 1). Maxwell Gen 2 would be a good option, too, once the drivers (etc.) get resolved.
Posting FAH's log:
How to provide enough info to get helpful support.
Re: What hardware provides best ppd per Watt?
Dan_G wrote: I would guess the best ppd per Watt ratio would be achieved by second-generation Maxwells, but has anyone got any hard figures?
I don't think there are significant differences between "first" and "second" generation Maxwells; it is mainly just the number of shaders and corresponding circuitry. But the quick-return bonus will favor the larger cards, other things being equal.
As for my GTX 750 Tis, they yield about 66k PPD at 41 watts on the Core_17 P9201 work units (as measured by the TDP% in GPU-Z). I don't think that accounts for the static power, which is about 5 watts, so it is about 46 watts total. And what you draw from the power line of course depends on the efficiency of your power supply. So that is 1435 PPD/watt for the card itself.
As for the Core_15 work units, it is 30k PPD (P7622) for a total of 56 watts, or 536 PPD/watt.
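A sketch of the bookkeeping in the two paragraphs above, with the static draw added to the measured dynamic draw. The 90% PSU efficiency is my own assumption for illustration, since the post only notes that wall draw depends on the supply:

```python
def card_ppd_per_watt(ppd: float, dynamic_watts: float, static_watts: float = 0.0) -> float:
    """PPD per watt at the card, counting idle/static draw on top of the measured load."""
    return ppd / (dynamic_watts + static_watts)

def wall_draw(card_watts: float, psu_efficiency: float) -> float:
    """Approximate wall draw for a given card draw and PSU efficiency."""
    return card_watts / psu_efficiency

# GTX 750 Ti, Core_17 P9201: 66k PPD, 41 W dynamic (GPU-Z TDP%) plus ~5 W static.
print(f"Core_17: {card_ppd_per_watt(66_000, 41, 5):.0f} PPD/W")  # ~1435
# Core_15 P7622: 30k PPD at 56 W total.
print(f"Core_15: {card_ppd_per_watt(30_000, 56):.0f} PPD/W")     # ~536
# Assuming a 90% efficient PSU (illustrative), 46 W at the card is about:
print(f"wall: {wall_draw(46, 0.90):.1f} W")                      # ~51.1 W
```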
The real question is what happens with Core_18? The PPD will apparently be low at first, but might (who knows?) improve later. I would keep my powder dry for a while.
-
- Posts: 110
- Joined: Thu Apr 30, 2009 7:31 pm
- Hardware configuration: i7-3930K@4.1GHz
GTX680@1.275GHz
Q9300@2.4GHz
GTX460@800MHz - Location: Essen, Germany
Re: What hardware provides best ppd per Watt?
Windows: GTX 980@1500MHz, 297k PPD (P9201), ~175W => 1700 PPD/W
Linux: GTX 980@1500MHz, reportedly ~360k PPD (P9201), ~175W (?) => 2050 PPD/W
Heiko
Re: What hardware provides best ppd per Watt?
Jimf wrote: As for my GTX 750 Tis, they yield about 66k PPD at 41 watts on the Core_17 P9201 work units (as measured by the TDP% in GPU-Z). I don't think that accounts for the static power, which is about 5 watts, so it is about 46 watts total. And what you draw from the power line of course depends on the efficiency of your power supply. So that is 1435 PPD/watt for the card itself.
I love my GTX 750 Ti and have it running on an old Core 2 Duo. I think one of the things that needs to be considered is the total system power usage when folding certain types of units. When I fold x'17 units, fahcore.exe uses 50% of my CPU resources; x'15 units don't have this behavior. I haven't put the Kill A Watt on it to get hard numbers, but I'm pretty sure the total consumption is significantly higher than just the card because of this. My guess is that the behavior for x'17 units will change once the video driver problems are resolved by Nvidia.
Re: What hardware provides best ppd per Watt?
Nert,
Yes, I have seen that behavior too. The "50%" is actually 100% of one of the two cores. I think the fact that the Core_17 work units use 100% of a core is because of how OpenCL is implemented on the Nvidia cards (differently than on the AMD cards by the way), and there seems to be nothing the Folding developers can do about it. But I don't think the core is really being used that heavily in spite of the CPU% measurement, and the CPU power won't add much to the total, though the GTX 750 Ti is so efficient it would be a larger percentage increase than with the older cards.
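To show how much a busy-waiting core can matter on an efficient card, here is a rough sketch. The 15 W for the spinning CPU core is purely a guess for illustration, since nobody in this thread has measured it:

```python
def system_ppd_per_watt(ppd: float, card_watts: float, cpu_core_watts: float) -> float:
    """Efficiency once the feeding CPU core's draw is counted alongside the card."""
    return ppd / (card_watts + cpu_core_watts)

# GTX 750 Ti figures from earlier in the thread: 66k PPD, ~46 W at the card.
# The 15 W for the spin-waiting core is an assumed value, not a measurement.
print(f"card alone:       {66_000 / 46:.0f} PPD/W")                          # ~1435
print(f"card + spin core: {system_ppd_per_watt(66_000, 46, 15):.0f} PPD/W")  # ~1082
```

Even a modest CPU-side draw cuts the headline PPD/W noticeably when the card itself only pulls around 46 W, which is the "larger percentage increase" point above.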