Not new to folding, but new to dedicated folding.
I have been folding for a few years now; before that I was running SETI@home. I've never messed with settings, just let BOINC do its thing. I'm looking into building a dedicated folding machine to run 24/7. Over the past couple of days I have read a ton of info, but I'm getting mixed answers when it comes down to the best solution. I may have overlooked something, but nevertheless I'm still confused.
I'm looking to spend about $5k CAD initially, with plans to keep growing the machine into a farm over time. What I'm confused about is whether I should go with a single multicore processor and multiple GPUs, or a quad Xeon with a single GPU to start. I'm not worried about electricity costs, just PPD output.
Re: Not new to folding, but new to dedicated folding.
Long story short, if PPD output is your thing, you should focus on the GPUs with the best PPD per dollar. That said, you'd still need an appropriate PSU/MB/CPU to drive them and a good case/cooling. With your budget I'd go for Haswell-E, an X99 MB, and 4x 980... Of course that's already maxing out such a system from day one - you could also go with a dual (or more) CPU Xeon now and add GPUs later on, but your PPD at the beginning will be much lower. Going forward, cost/PPD-wise I think building a rack of multiple machines (such as the example I gave) would be more effective than investing in Xeon solutions.
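To make "best PPD per dollar" concrete, here is a minimal sketch of the kind of comparison you'd run before buying. The prices and PPD numbers below are placeholders I made up for illustration, not benchmarks, so plug in current figures yourself.

```python
# PPD-per-dollar ranking -- a rough sketch with made-up placeholder numbers.
# Replace prices and PPD estimates with current figures before deciding.
candidates = {
    # name: (approx_price_cad, approx_ppd)
    "GTX 970": (400, 300_000),
    "GTX 980": (700, 400_000),
    "R9 290X": (450, 250_000),
}

ranked = sorted(candidates.items(), key=lambda kv: kv[1][1] / kv[1][0], reverse=True)
for name, (price, ppd) in ranked:
    print(f"{name:8s} ~{ppd / price:,.0f} PPD per dollar")
```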
Windows 11 x64 / 5800X@5Ghz / 32GB DDR4 3800 CL14 / 4090 FE / Creative Titanium HD / Sennheiser 650 / PSU Corsair AX1200i
Re: Not new to folding, but new to dedicated folding.
The best way of spending the money would be to buy more than one rig.
I would recommend buying a motherboard that supports PCIe 3.0 x8/x8 and running two GPUs on each rig.
A socket 1150 platform is more than enough if you want the most PPD per dollar and per watt.
Two GTX 970s will end up giving about 550k PPD in total.
MSI cards are quiet; if noise matters to you, I would recommend those.
Here is an example of hardware to buy:
http://www.amazon.com/MSI-GTX-970-4G-Gr ... Q5SCW267XK
http://www.amazon.com/Intel-Pentium-Pro ... words=1150
http://www.amazon.com/MSI-Motherboards- ... words=1150
http://www.amazon.com/Corsair-Professio ... s=platinum
http://www.amazon.com/OCZ-Solutions-2-5 ... ywords=ssd
http://www.amazon.com/gp/product/B00J8E ... PDKIKX0DER
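As a rough way to see how far a budget stretches with identical rigs like this, here is a small sketch. The ~550k PPD per rig comes from the figure above, while the per-rig cost is a hypothetical placeholder rather than a quote from those links.

```python
# How far a fixed budget goes with identical dual-GTX-970 rigs.
# ppd_per_rig is the figure given above; cost_per_rig is a placeholder guess.
budget_cad = 5_000
cost_per_rig = 1_600
ppd_per_rig = 550_000

rigs = budget_cad // cost_per_rig
print(f"{rigs} rigs -> ~{rigs * ppd_per_rig:,} PPD total, "
      f"${budget_cad - rigs * cost_per_rig} left over")
```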
- Posts: 704
- Joined: Tue Dec 04, 2007 6:56 am
- Hardware configuration: Ryzen 7 5700G, 22.40.46 VGA driver; 32GB G-Skill Trident DDR4-3200; Samsung 860EVO 1TB Boot SSD; VelociRaptor 1TB; MSI GTX 1050ti, 551.23 studio driver; BeQuiet FM 550 PSU; Lian Li PC-9F; Win11Pro-64, F@H 8.3.5.
[Suspended] Ryzen 7 3700X, MSI X570MPG, 32GB G-Skill Trident Z DDR4-3600; Corsair MP600 M.2 PCIe Gen4 Boot, Samsung 840EVO-250 SSDs; VelociRaptor 1TB, Raptor 150; MSI GTX 1050ti, 526.98 driver; Kingwin Stryker 500 PSU; Lian Li PC-K7B. Win10Pro-64, F@H 8.3.5.
- Location: @Home
Re: Not new to folding, but new to dedicated folding.
Multi-CPU Xeon motherboards are expensive, and they will not let you add any more GPUs than a single-CPU rig. I agree with Breach that the best current system would be an X99 MoBo with a CPU that supports 40 PCIe lanes (5960X or 5930K). That way you might be able to put in as many as 5 GPUs (two dual-GPU cards and a single-GPU card), each at x8 bandwidth. You'll need a case that will support radiators for liquid cooling of at least the CPU, and maybe the GPUs as well.
I'm not a power-Folder, so specific GPU performance and configuration is not something I keep close track of. You may be limited by the power requirements of the high-power GPUs on a single PSU and electrical circuit.
Ryzen 7 5700G, 22.40.46 VGA driver; MSI GTX 1050ti, 551.23 studio driver
Ryzen 7 3700X; MSI GTX 1050ti, 551.23 studio driver [Suspended]
Re: Not new to folding, but new to dedicated folding.
Thanks for the info guys, I appreciate it!
Re: Not new to folding, but new to dedicated folding.
jrweiss wrote: Multi-CPU Xeon motherboards are expensive, and they will not let you add any more GPUs than a single-CPU rig. I agree with Breach that the best current system would be an X99 MoBo with a CPU that supports 40 PCIe lanes (5960X or 5930K). That way you might be able to put in as many as 5 GPUs (two dual-GPU cards and a single-GPU card), each at x8 bandwidth. You'll need a case that will support radiators for liquid cooling of at least the CPU, and maybe the GPUs as well.

I'm not a power-Folder, so specific GPU performance and configuration is not something I keep close track of. You may be limited by the power requirements of the high-power GPUs on a single PSU and electrical circuit.

Heat is a problem when folding four GPUs in the same system. If you put them under water, it will cost far more than you gain.
And for some reason the PPD seems to drop when you put more than one card in a system:
One GTX 970 = 300k PPD
Two GTX 970 = 275k PPD per card
Four GTX 970 = 250k PPD per card
I would go for separate systems like the one I showed you earlier in this thread.
It will also be more flexible when it comes to downtime. But most of all, you will get rid of the heat more efficiently.
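If you take those per-card numbers at face value, a quick sketch shows why separate boxes win on raw PPD for the same four cards (the figures are the rough observations above, not guarantees):

```python
# Total PPD for different ways of housing four GTX 970s, using the
# approximate per-card figures quoted above.
ppd_per_card = {1: 300_000, 2: 275_000, 4: 250_000}

one_quad_box    = 4 * ppd_per_card[4]        # four cards in one system
two_dual_boxes  = 2 * 2 * ppd_per_card[2]    # two cards in each of two systems
four_single_box = 4 * 1 * ppd_per_card[1]    # one card in each of four systems

print(f"{one_quad_box:,} vs {two_dual_boxes:,} vs {four_single_box:,}")
# -> 1,000,000 vs 1,100,000 vs 1,200,000 PPD from the same four cards
```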
- Posts: 177
- Joined: Tue Aug 26, 2014 9:48 pm
- Hardware configuration: 10 SMP folding slots on Intel Phi "Knights Landing" system, configured as 24 CPUs/slot
9 AMD GPU folding slots
31 Nvidia GPU folding slots
50 total folding slots
Average PPD/slot = 459,500
- Location: Dallas, TX
Re: Not new to folding, but new to dedicated folding.
You may want to consider starting with two separate systems, giving you a foundation you can build on over time.
The key to dense folding rigs is to optimize around the physical PCIe slots available, in a configuration that allows heat to be extracted/exhausted efficiently without impacting performance. An 1150 CPU like the i7-4790K on an EVGA Z97 Classified (152-HR-E979-KR) motherboard gives you 5 PCIe 3.0 x16 slots, nicely spaced so 3 GPUs can be hosted without undue heat build-up by using physical PCIe slots 1, 4 and 7. Granted, the 1150 only has 16 PCIe lanes, but with the PLX PCIe switch chip the effective number of lanes is expanded to 32, with only a very minor latency impact on performance. In this configuration using GTX 980s, the total production is between 870,000 and 990,000 PPD (depending on work unit) without over-clocking the GPUs. You can start with one 980 and add others over time. I run this exact configuration in one of my systems.
You could build a second 1150 rig based on the same chip and motherboard, but instead of Nvidia you might consider an AMD/ATI solution, to mitigate the risk of going only AMD or only Nvidia. Using a Corsair AXi 1500 power supply on a dedicated 120V, 20-amp circuit, you can place three R9-295x cards (6 GPUs total) on the platform and get ~1,440,000 PPD with a small overclock (1.030 vs 1.018 GHz stock) on Core x17 WUs (note that Core x18 currently does not perform as well on AMD/ATI). The power draw is significant at around 1400 watts continuous, so your power bill will certainly notice it if you fold 24x7. I have this solution working in a very heavily modified Chenbro RM41300-FS81 case in a rack-mount configuration dedicated to GPU folding (i.e. no SMP configured, since a small CPU cooler is required to fit the R9 radiators in the case and to use PCIe slot 1 on the motherboard). Again, you could start with one R9-295x and build from there.
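A quick sanity check on that circuit, as a sketch: the 80% continuous-load derating used below is the commonly cited North American guideline for loads running around the clock; verify against your local electrical code.

```python
# Does a ~1400 W continuous rig fit on a dedicated 120 V / 20 A circuit?
volts, amps = 120, 20
continuous_limit_w = volts * amps * 0.8   # 80% derating for continuous loads -> 1920 W

rig_draw_w = 1400                         # figure quoted above
print(f"Headroom: {continuous_limit_w - rig_draw_w:.0f} W")   # ~520 W to spare
```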
Note that with each of these solutions, I use Process Lasso to push all Windows system and utility applications, including FAH Control, FAHClient and the core wrappers, onto CPU cores 0 through 3, and dedicate the GPU cores (x15, x17, x18) to CPU cores 4 through 7. Keeping the CPU cores that service the GPUs "quiet" makes a big difference to GPU performance by reducing the latency required to service the GPU interrupt(s). That latency reduction is one of the reasons you can use an 1150-based motherboard with a PLX chip and extract performance very close to a more expensive 2011-v3 motherboard (and DDR4 memory!) with 40 PCIe lanes.
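Process Lasso is a GUI tool, but the same affinity idea can be sketched in a few lines with psutil. The process names below are examples only (FahCore executable names vary by core version), so adjust them to whatever actually shows up on your system.

```python
# A minimal sketch of the affinity split described above, using psutil
# (pip install psutil). Process names are examples -- adjust to match
# the FahCore versions actually running on your system.
import psutil

GPU_CORE_PROCS = {"FahCore_17.exe", "FahCore_18.exe", "FahCore_15.exe"}
GPU_FEED_CORES = [4, 5, 6, 7]   # reserved for the processes feeding the GPUs
OTHER_CORES    = [0, 1, 2, 3]   # FAHClient, FAHControl and everything else

for p in psutil.process_iter(["name"]):
    try:
        if p.info["name"] in GPU_CORE_PROCS:
            p.cpu_affinity(GPU_FEED_CORES)
        else:
            p.cpu_affinity(OTHER_CORES)
    except (psutil.AccessDenied, psutil.NoSuchProcess):
        pass  # skip protected/system processes we can't modify
```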
I don't take sides on whether to use AMD or Nvidia; I just believe in mitigating the risk of FAH software that can sometimes favor the performance of one GPU type over the other, so I have generally split my GPUs roughly 50/50 between the two manufacturers.
Hardware config viewtopic.php?f=66&t=17997&p=277235#p277235
- Posts: 704
- Joined: Tue Dec 04, 2007 6:56 am
- Hardware configuration: Ryzen 7 5700G, 22.40.46 VGA driver; 32GB G-Skill Trident DDR4-3200; Samsung 860EVO 1TB Boot SSD; VelociRaptor 1TB; MSI GTX 1050ti, 551.23 studio driver; BeQuiet FM 550 PSU; Lian Li PC-9F; Win11Pro-64, F@H 8.3.5.
[Suspended] Ryzen 7 3700X, MSI X570MPG, 32GB G-Skill Trident Z DDR4-3600; Corsair MP600 M.2 PCIe Gen4 Boot, Samsung 840EVO-250 SSDs; VelociRaptor 1TB, Raptor 150; MSI GTX 1050ti, 526.98 driver; Kingwin Stryker 500 PSU; Lian Li PC-K7B. Win10Pro-64, F@H 8.3.5.
- Location: @Home
Re: Not new to folding, but new to dedicated folding.
Also, Maximum PC Magazine (http://www.maximumpc.com/) has built and reported on several high-density, multi-GPU rigs in the past year. You might search their site for specific ideas on cooling setups. There are now closed-loop coolers available for GPUs, and they may be of interest to the OP.
Ryzen 7 5700G, 22.40.46 VGA driver; MSI GTX 1050ti, 551.23 studio driver
Ryzen 7 3700X; MSI GTX 1050ti, 551.23 studio driver [Suspended]
- Posts: 27
- Joined: Fri Aug 08, 2008 4:15 am
- Hardware configuration: Toshiba X205-SLi1 using nVidia CUDA drivers version 177.35
Re: Not new to folding, but new to dedicated folding.
If it were my money, I would buy two or three systems, each with a socket 1150 i7 CPU and two Nvidia 980s, and install Linux for the OS. I would not overclock anything, because that means more heat, which means a greater chance of being throttled from hitting the TDP of each device. Overclocking/overheating could also mean a shorter life for said components. I would also at least look into UPS systems and other forms of power protection. By having multiple systems with identical parts, you also minimize the effects of a device failure; e.g., if one motherboard/PSU/etc. fails, you can still fold on the other systems. My personal opinion is that 1,000,000 PPD for a year is not nearly as good as 500,000 PPD for five years.
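To put numbers on that last point, here is a quick sketch of total points produced over the hardware's life, ignoring downtime and assuming the lifespans in that comparison:

```python
# Lifetime output: 1M PPD for one year vs 500k PPD for five years.
def lifetime_points(ppd, years):
    return ppd * 365 * years

hot_and_short = lifetime_points(1_000_000, 1)   # pushed hard, shorter life
cool_and_long = lifetime_points(500_000, 5)     # run conservatively, lasts
print(f"{hot_and_short:,} vs {cool_and_long:,}")  # 365,000,000 vs 912,500,000 points
```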
EDIT: http://www.anandtech.com uses F@H as part of their benchmark suite for PC components.
I hope that helps,
E
"In Theory there is no difference between Theory and Practice. In Practice there is."
- Posts: 410
- Joined: Mon Nov 15, 2010 8:51 pm
- Hardware configuration: 8x GTX 1080
3x GTX 1080 Ti
3x GTX 1060
Various other bits and pieces
- Location: South Coast, UK
Re: Not new to folding, but new to dedicated folding.
everyman wrote: If it were my money, I would buy two or three systems, each with a socket 1150 i7 CPU and two Nvidia 980s, and install Linux for the OS. I would not overclock anything, because that means more heat, which means a greater chance of being throttled from hitting the TDP of each device. Overclocking/overheating could also mean a shorter life for said components. I would also at least look into UPS systems and other forms of power protection. By having multiple systems with identical parts, you also minimize the effects of a device failure; e.g., if one motherboard/PSU/etc. fails, you can still fold on the other systems. My personal opinion is that 1,000,000 PPD for a year is not nearly as good as 500,000 PPD for five years.

Same opinion here, except I'd scale down even further to Pentiums/Celerons instead of i7s, to save more money to put towards another rig. Two cores will feed a pair of GPUs no problem and use less power. You miss out on CPU folding, but that would only be around 30k PPD anyway. Also, aim for decent 600W Gold or better PSUs - even if electricity costs aren't a concern, it helps to save on heat and resources!
It might be awesome to have a single quad-GPU powerhouse putting out 1.5M PPD, but with an X99 board, DDR4 and an i7 processor (plus probably watercooling) you're paying a premium that will likely end up costing more than three dual-GPU systems, which combined will do 2M+ PPD.
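As a rough illustration of that premium, here is a sketch comparing cost per PPD; the PPD figures are the ones mentioned above, while the prices are hypothetical placeholders:

```python
# Cost per million PPD, premium single rig vs several budget rigs.
# PPD figures are from the discussion above; prices are illustrative guesses.
rigs = {
    "quad-GPU X99 rig":       {"cost": 5_000, "ppd": 1_500_000},
    "3x budget dual-GPU rig": {"cost": 4_500, "ppd": 2_000_000},
}

for label, r in rigs.items():
    print(f"{label:24s} ${r['cost'] / r['ppd'] * 1_000_000:,.0f} per 1M PPD")
```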
Hope that helps