32 cores on Debian - Is this right?
Re: 32 cores on Debian - Is this right?
Free is the right price.
How to provide enough information to get helpful support
Tell me and I forget. Teach me and I remember. Involve me and I learn.
Re: 32 cores on Debian - Is this right?
The Tesla GPUs are gradually being phased out. (They're pretty slow compared to later generations of GPUs.) You may be able to use them temporarily if the Linux drivers work as they should. For the Kepler/Fermi lines, the proprietary drivers are strongly recommended (mandatory?). Development is planning to support the Radeon line, too, if you happen to have AMD GPUs, but it's not yet supported on Linux.
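(For what it's worth, on Debian the proprietary driver comes from the non-free repository. A minimal sketch of the install, assuming wheezy-era package names - check what your release actually ships, since names change between releases:)
Code:
# Enable the non-free component in /etc/apt/sources.list, e.g.:
# deb http://<your-mirror>/debian wheezy main contrib non-free

# Install the proprietary NVIDIA driver; the DKMS package rebuilds
# the kernel module automatically after kernel updates.
sudo apt-get update
sudo apt-get install nvidia-kernel-dkms nvidia-xconfig

# Generate an xorg.conf that loads the proprietary driver, then reboot.
sudo nvidia-xconfig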
Posting FAH's log:
How to provide enough info to get helpful support.
Re: 32 cores on Debian - Is this right?
Teslas are being phased out, bruce?
The K20s should get you somewhere in the ballpark of the 780 - more 'thread processors', but at a lower frequency. Teslas might actually allow better support on Linux, because my understanding is that the Radeon Linux drivers provide less information than the professional drivers.
The high up-front cost of the professional cards makes them less economical for F@H, but if the money has already been sunk, then ongoing costs are likely to be similar to their respective desktop counterparts (which in the case of the K20/K20X are the Titan/780(+Ti)).
Re: 32 cores on Debian - Is this right?
k1wi wrote: Teslas are being phased out, bruce? The K20s should get you somewhere in the ballpark of the 780 - more 'thread processors', but at a lower frequency. Teslas might actually allow better support on Linux, because my understanding is that the Radeon Linux drivers provide less information than the professional drivers. The high up-front cost of the professional cards makes them less economical for F@H, but if the money has already been sunk, then ongoing costs are likely to be similar to their respective desktop counterparts (which in the case of the K20/K20X are the Titan/780(+Ti)).
What's the difference between the 780's and the Titan's performance? Is the Titan worth the difference?
Re: 32 cores on Debian - Is this right?
I misspoke. The G* chips are being phased out. The F* and K* chips are not.
I consider the Tesla overpriced for the home market. It does bring better DP performance and more VRAM, but FAH is aimed at the *@home market segment. FAH doesn't use much VRAM, nor does it use DP, making the professional cards (mostly) unnecessary.
I doubt that the Titan is worth the difference, but we'll know more after things settle out from the latest rounds of product announcements and price cuts. The Titan fits somewhere between the *@home market and the Pro market, so look carefully at things like SP performance and the cost of power - not at DP or RAM (if you're only considering FAH). If you game or if you run professional CAD, make the appropriate adjustments to my recommendations.
Posting FAH's log:
How to provide enough info to get helpful support.
Re: 32 cores on Debian - Is this right?
ekiro wrote: What's the difference between the 780's and the Titan's performance? Is the Titan worth the difference?
They just dropped the price on the 780 down to $500 USD, which makes it more competitive on a PPD/$ basis. The Titan, at twice the price, does not fold at twice the speed.
From an Anandtech review...
[benchmark chart image]
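(A quick back-of-the-envelope on that PPD/$ point - the PPD figures below are assumptions for illustration, not measurements:)
Code:
# Hypothetical PPD figures, chosen only to illustrate the PPD/$ argument.
ppd_780, price_780 = 100_000, 500       # GTX 780 after the price cut
ppd_titan, price_titan = 130_000, 1000  # Titan: faster, but not 2x faster

print(ppd_780 / price_780)      # 200.0 points per day per dollar
print(ppd_titan / price_titan)  # 130.0 points per day per dollar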
How to provide enough information to get helpful support
Tell me and I forget. Teach me and I remember. Involve me and I learn.
Re: 32 cores on Debian - Is this right?
7im wrote: ekiro wrote: What's the difference between the 780's and the Titan's performance? Is the Titan worth the difference?
They just dropped the price on the 780 down to $500 USD, which makes it more competitive on a PPD/$ basis. The Titan, at twice the price, does not fold at twice the speed. From an Anandtech review...
Thanks for the info.
Yeah I noticed the prices of the 780s are around $500. That's perfect. I'll see what I can find.
How many WUs can 4x 780s complete in a single day, dedicating all system resources to folding?
Re: 32 cores on Debian - Is this right?
Why buy 780s when you have comparable Teslas?
WUs per day is not a good judge of speed. Work unit sizes vary quite a bit. Points Per Day is more accurate. As much as ~150K points per day per GPU, if you have a motherboard that supports four x16 slots.
Also not sure why the same GPU gets different results from Anand...
In the image above, the GTX 770 got 15.1 ns, but in the image below, it got 35.6 ns. Hmm?
[benchmark chart image]
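(Why PPD rather than WUs/day: under the quick-return bonus, the credit for a WU grows with how far ahead of the deadline it is returned, so faster hardware earns disproportionately more per WU. A sketch of the bonus formula - the base points, k factor, and deadline below are made-up example values; every project sets its own:)
Code:
import math

# Hypothetical project parameters, for illustration only.
base_points = 9000     # base credit for the WU
k = 0.75               # project-specific bonus factor
deadline_days = 6.0    # deadline for bonus eligibility

def wu_points(days_to_complete):
    # Quick-return bonus: credit scales with the square root of
    # (k * deadline / elapsed time), never dropping below base points.
    bonus = math.sqrt(k * deadline_days / days_to_complete)
    return base_points * max(1.0, bonus)

for days in (0.25, 2.0):
    pts = wu_points(days)
    print(days, round(pts), round(pts / days))  # days/WU, points/WU, PPD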
How to provide enough information to get helpful support
Tell me and I forget. Teach me and I remember. Involve me and I learn.
Re: 32 cores on Debian - Is this right?
Yeah, I don't know about those results for the 770s.
One more question. What is the best hardware to dedicate for this type of computing?
Re: 32 cores on Debian - Is this right?
"Best" is rather subjective. Best PPD in terms of initial costs? Best PPD in terms of power costs? Best in terms of silence/noise? Best if you're also going to use it for gaming? Best if you have a large or a small monitor -- or multiple monitors?ekiro wrote:Yeah I don't know about those results for the 770s
One more question. What is the best hardware to dedicate for this type of computing?
By any of those criteria, the best hardware changes with each release of hardware and, to a lesser extent, with each revision of the drivers. Not to be flippant, but the best hardware is the hardware you already own ... at least if it's still working.
Look around the forum and you'll find dozens of topics where people have expressed their own opinions and they certainly do not agree with each other. Those topics will give you some things to think about, though.
The only recommendation that Stanford makes is that you don't buy hardware JUST for folding@home.
Posting FAH's log:
How to provide enough info to get helpful support.
Re: 32 cores on Debian - Is this right?
Will the folding app automatically leverage all GPUs and CPUs in the system?
Re: 32 cores on Debian - Is this right?
ekiro wrote: Will the folding app automatically leverage all GPUs and CPUs in the system?
Yes. The installer will set up both CPU and GPU slots to take advantage of the full system. However, if you add hardware after the install, you will have to manually add another slot to take advantage of the additional hardware. Or you can run the installer again and it will automatically detect the change in hardware.
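(For reference, a slot added by hand ends up in FAHClient's config.xml, which can also be edited through FAHControl. A minimal sketch - the slot ids and layout are placeholders, not a drop-in config:)
Code:
<config>
  <!-- One folding slot per compute resource. -->
  <slot id='0' type='CPU'/>  <!-- all usable CPU cores -->
  <slot id='1' type='GPU'/>  <!-- first detected GPU -->
  <!-- A card added after install needs its own slot, e.g.: -->
  <slot id='2' type='GPU'/>
</config>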
How to provide enough information to get helpful support
Tell me and I forget. Teach me and I remember. Involve me and I learn.
Re: 32 cores on Debian - Is this right?
The client software will only partially do that. Some assumptions, based on prior experience with mixed CPU and GPU folding during development of the current version, no longer hold with the newest GPU folding core.
The older Core_16 for AMD-ATI GPUs would use almost an entire CPU core while active, which would impact SMP folding on the CPU if all cores were also assigned to it. But Core_15 for nVidia GPUs used almost no CPU time while folding. So the client was set to reserve one core for GPU folding only if a supported AMD-ATI card was detected. This applied to Windows, as GPU folding was only supported on that OS.
With Core_17 released, that prior experience no longer holds. With this core, AMD-ATI GPUs use CPU time at the beginning and then periodically, every couple percent of progress, while performing a checkpoint and data validation; in between, little CPU is used. However, Core_17 with an nVidia GPU uses most of a CPU core continuously, due to the way the current drivers are written by nVidia. So the number of cores assigned to the CPU folding slot needs to be manually adjusted to get the best utilization.
Cards from both makers are supported on Windows, but so far only nVidia cards are supported for use with Linux. Keep that in mind when deciding what hardware and OS to use. More than one GPU in a system may take additional configuration steps for folding. If you have something specific in mind, ask here and those with experience with similar equipment can answer with what worked for them.
Re: 32 cores on Debian - Is this right?
I have this build and I'd like to know what I need to do to take 100% advantage of all the computing power available. (A config sketch follows the parts list below.)
Ubuntu 13.10 64-bit
ASUS Rampage IV Extreme
Intel i7 4930K
2x Kingston HyperX 16GB (4x4GB) DDR3 2400MHz Quad Channel
4x NVIDIA GeForce GTX 780
4x Samsung 840 Pro SSD RAID0
Rosewill 1600W 80+ Silver PSU
--
Cooler Master HAF 932
Cooler Master Seidon 240M
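(Per Joe_H's point above about Core_17 holding a CPU core per nVidia GPU, a config for this build would likely pair one GPU slot per card with a CPU slot trimmed to leave a thread free for each GPU. A sketch only - the cpus value is an assumption to tune, since the 4930K has 12 threads and four GPUs would tie up roughly four of them:)
Code:
<config>
  <!-- CPU slot: 12 threads minus ~4 reserved to feed the four GPUs. -->
  <slot id='0' type='CPU'>
    <cpus v='8'/>
  </slot>
  <!-- One slot per GTX 780. -->
  <slot id='1' type='GPU'/>
  <slot id='2' type='GPU'/>
  <slot id='3' type='GPU'/>
  <slot id='4' type='GPU'/>
</config>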
Re: 32 cores on Debian - Is this right?
ekiro wrote: I have this build and I'd like to know what I need to do to take 100% advantage of all computing power available.
You may (or may not) already be using 100% of the computing resources. We need to see the first couple pages of FAH's latest log file: /var/lib/fahclient/log.txt. (The one you posted 32 hrs ago doesn't match the hardware description that you just gave.)
Which GPU drivers are installed?
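(To collect both of those, something along these lines should work on the default Linux install - nvidia-smi ships with the proprietary driver:)
Code:
# First chunk of the latest log; it records the detected hardware and slots.
head -n 100 /var/lib/fahclient/log.txt

# Installed NVIDIA driver version and the GPUs it can see.
nvidia-smi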
Posting FAH's log:
How to provide enough info to get helpful support.