PCI-e bandwidth/capacity limitations

NGBRO
Posts: 12
Joined: Mon Apr 08, 2013 10:49 am

Re: PCI-e bandwidth/capacity limitations

Post by NGBRO »

Aurum wrote:Have you watched it run with Windows/Task Manager/Performance to see if the Q9400 is maxing out?
Yeah, it only used 20-30% CPU while chugging along on just the GPU.
Aurum
Posts: 292
Joined: Sat Oct 03, 2015 3:15 pm
Location: The Great Basin

Re: PCI-e bandwidth/capacity limitations

Post by Aurum »

NGBRO wrote:I recently got a GTX 1060 Mini 3GB to try folding full-time. I installed it in a slapped-together system with a Q9400 and I see that the GPU is running at PCIe 1.1 x16, which seems to be the max supported by the mobo and CPU.
Here's your MB: https://support.hp.com/nz-en/document/c01357119
In Science We Trust
Aurum
Posts: 292
Joined: Sat Oct 03, 2015 3:15 pm
Location: The Great Basin

Re: PCI-e bandwidth/capacity limitations

Post by Aurum »

Aurum wrote:There are some pretty low-paying WUs running now; maybe you caught one of those.
x16 1.0 ≈ x8 2.0 ≈ x4 3.0 (rough math sketched below), so you may be taking a bite out of your 1060's PPD.
I've got a rig with four 1060 6GB cards at x16 2.0, x8 2.0, x8 2.0 & x1 2.0.
PPD per card: 348477, 331888, 266147 and 206888.
Update after all four cards have moved on to new WUs:
380905, 358867, 265628 and 101697

NGBRO, your x16 1.1 has 8 times the throughput of my x1 2.0, so I think it should do just fine with a 1060. Give it time to work on several WUs.
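
For anyone who wants to check that arithmetic, here's a quick Python sketch. The per-lane figures are the usual ballpark numbers (roughly 250/500/985 MB/s effective for PCIe 1.x/2.0/3.0 after encoding overhead), not measurements:

```python
# Approximate effective PCIe bandwidth per lane, in MB/s, after encoding
# overhead. Ballpark community figures, not measured values.
PER_LANE_MBS = {"1.0": 250, "1.1": 250, "2.0": 500, "3.0": 985}

def slot_bandwidth(gen: str, lanes: int) -> int:
    """Approximate effective bandwidth of a PCIe link in MB/s."""
    return PER_LANE_MBS[gen] * lanes

print(slot_bandwidth("1.0", 16))  # x16 1.0 -> 4000 MB/s
print(slot_bandwidth("2.0", 8))   # x8  2.0 -> 4000 MB/s
print(slot_bandwidth("3.0", 4))   # x4  3.0 -> 3940 MB/s, close enough

# NGBRO's x16 1.1 link vs my x1 2.0 link:
print(slot_bandwidth("1.1", 16) / slot_bandwidth("2.0", 1))  # -> 8.0
```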

BTW, look at Processes in the Task Manager and see if something is hogging memory. E.g., FAHControl has a memory leak; if you leave it open it will grow to over a GB before you know it. I never leave mine open any more.
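
If you'd rather not babysit Task Manager, here's a minimal Python sketch that logs FAHControl's memory once a minute. It assumes the third-party psutil package and that the process is named FAHControl.exe (as on Windows); adjust to taste:

```python
import time

import psutil  # third-party: pip install psutil

# Log FAHControl's resident memory once a minute so a slow leak shows up.
# Assumption: the process is named "FAHControl.exe" (Windows); adjust as needed.
while True:
    for proc in psutil.process_iter(["name", "memory_info"]):
        if proc.info["name"] == "FAHControl.exe":
            rss_mb = proc.info["memory_info"].rss / (1024 * 1024)
            print(f"FAHControl RSS: {rss_mb:.0f} MB")
    time.sleep(60)  # Ctrl+C to stop
```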
In Science We Trust
NGBRO
Posts: 12
Joined: Mon Apr 08, 2013 10:49 am

Re: PCI-e bandwidth/capacity limitations

Post by NGBRO »

Aurum wrote:Here's your MB
I know; it's just that they don't state the PCIe generation there. I got my PCIe 1.1 x16 info from GPU-Z.
Aurum wrote:NGBRO, Your x16 1.1 has 8 times the throughput of my x1 2.0 so I think it should do just fine with a 1060. Give it time to work on several WUs.

BTW, Look at Processes in the Task Manager and see if something is hogging memory. E.g., FAHControl has a memory leak and if you leave it open it will grow to over a GB before you know it. I never leave mine open any more.
Mine is the 3GB variant with fewer CUDA cores. What's the expected average ballpark PPD for that instead? BTW, I didn't run out of RAM when I checked occasionally. How would that affect things anyway?

EDIT: Moved the system to an open spot and removed the side panel. Now it maxes out at ~75°C and stays at 1835 MHz; TDP is 80-98%. PPD so far is ~285K. CPU usage for the fahcore is 25% (which is technically one of my four cores) and its RAM maxes out at ~450 MB.
bruce
Posts: 20824
Joined: Thu Nov 29, 2007 10:13 pm
Location: So. Cal.

Re: PCI-e bandwidth/capacity limitations

Post by bruce »

Aurum wrote:I just ordered these 1x risers hoping they'll fit in the 1x slot tucked under an EVGA card plugged into a 16x slot.
https://www.amazon.com/gp/product/B017Q ... UTF8&psc=1
Does anybody know if this adapter (or something like it) will fit UNDER the HS/Fan on a GPU that's two slots wide?
foldy
Posts: 2040
Joined: Sat Dec 01, 2012 3:43 pm
Hardware configuration: Folding@Home Client 7.6.13 (1 GPU slots)
Windows 7 64bit
Intel Core i5 2500k@4Ghz
Nvidia gtx 1080ti driver 441

Re: PCI-e bandwidth/capacity limitations

Post by foldy »

@NGBRO: For the 1060 3GB expect about 338k PPD, compared to 375k PPD for the 1060 6GB. When I subtract 15% because of the slower PCIe slot, that gives 287k PPD, which matches your GPU's speed now that it's not thermal throttling anymore. Your CPU and RAM look good.
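
The back-of-the-envelope math behind that estimate (the PPD figure and the 15% penalty are rough ballparks, not guarantees):

```python
# Rough full-speed PPD for a GTX 1060 3GB (ballpark, not a guarantee):
ppd_1060_3gb = 338_000
pcie_penalty = 0.15  # assumed loss from the PCIe 1.1 x16 slot

print(ppd_1060_3gb * (1 - pcie_penalty))  # -> 287300.0, close to the observed ~285k
```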

@Aurum: Do you know why your 2nd and 3rd slots give such a performance difference although they look equal?
x16 2.0, x8 2.0, x8 2.0 & x1 2.0
380905, 358867, 265628 and 101697
Aurum
Posts: 292
Joined: Sat Oct 03, 2015 3:15 pm
Location: The Great Basin

Re: PCI-e bandwidth/capacity limitations

Post by Aurum »

Since all four cards are the same, I have no way of telling which is which in F@H. I can tell with GPU-Z, or if I had 4 different kinds of cards. I just sorted from highest to lowest. From GPU-Z I can tell you that the 1060 3GB card is in the x16 2.0 slot and the other 3 cards are 1060 6GB.
In Science We Trust
foldy
Posts: 2040
Joined: Sat Dec 01, 2012 3:43 pm
Hardware configuration: Folding@Home Client 7.6.13 (1 GPU slots)
Windows 7 64bit
Intel Core i5 2500k@4Ghz
Nvidia gtx 1080ti driver 441

Re: PCI-e bandwidth/capacity limitations

Post by foldy »

@Aurum
Do all GPUs run at the same clock speed? I calculated that the 2nd GPU is 25% faster than the 3rd although both are 1060 6GB and use PCIe 2.0 x8. That would match if the 2nd GPU runs at 2000 MHz while the 3rd runs at 1600 MHz.
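
The ratio check, using the first set of PPD numbers posted earlier in the thread:

```python
# PPD of Aurum's 2nd and 3rd cards (first reading in the thread):
ppd_2nd, ppd_3rd = 331_888, 266_147
print(ppd_2nd / ppd_3rd)  # -> ~1.25, i.e. the 2nd card is ~25% faster

# A 2000 MHz vs 1600 MHz core clock would produce the same ratio:
print(2000 / 1600)  # -> 1.25
```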

As the 1060 6GB has 10% more shaders than the 1060 3GB, I would recommend putting one of those in the x16 2.0 slot to minimize PCIe bandwidth losses, if any.
bruce
Posts: 20824
Joined: Thu Nov 29, 2007 10:13 pm
Location: So. Cal.

Re: PCI-e bandwidth/capacity limitations

Post by bruce »

On GPU-Z the "bus interface" will say something like this: PCIe x16 2.0 @ x4 2.0.

List all of them.
NGBRO
Posts: 12
Joined: Mon Apr 08, 2013 10:49 am

Re: PCI-e bandwidth/capacity limitations

Post by NGBRO »

I hit the absolute ceiling for my GTX 1060 3GB (if it stays below 65°C and hits 1900 MHz) at about 285k PPD. I think that's as high as it'll go on my "ancient" motherboard with PCIe 1.1 x16. If I can recoup the power cost and even make a profit from merged folding, I think it'd be a good bet to cash in on a cheap LGA1150 Pentium setup with PCIe 3.0.
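
For the power-cost side of that bet, a rough sketch; the wattage and electricity price here are assumptions, so plug in your own numbers:

```python
# All inputs are assumptions - substitute your own:
gpu_watts = 120              # GTX 1060 under FAH load, roughly
system_overhead_watts = 50   # CPU, board, fans, PSU losses
price_per_kwh = 0.12         # USD; varies a lot by region

kwh_per_month = (gpu_watts + system_overhead_watts) * 24 * 30 / 1000
cost = kwh_per_month * price_per_kwh
print(f"~{kwh_per_month:.0f} kWh/month, ~${cost:.2f}/month")  # ~122 kWh, ~$14.69
```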

Meanwhile, running with the case side open and in an open space, 100% fan, it tops out at 76°C and 1823 MHz, 275k+ PPD. It's hard to get better ventilation than this, as my case is a cheapo one with no provisions for mounting fans.
foldy
Posts: 2040
Joined: Sat Dec 01, 2012 3:43 pm
Hardware configuration: Folding@Home Client 7.6.13 (1 GPU slots)
Windows 7 64bit
Intel Core i5 2500k@4Ghz
Nvidia gtx 1080ti driver 441

Re: PCI-e bandwidth/capacity limitations

Post by foldy »

Looks good. You could put a case fan next to the GPU using cable ties. Or you could drill some holes in the case - then you can mount fans.
boristsybin
Posts: 50
Joined: Mon Jan 16, 2017 11:40 am
Hardware configuration: 4x1080Ti + 2x1050Ti
Location: Russia, Moscow

Re: PCI-e bandwidth/capacity limitations

Post by boristsybin »

Any experience getting more than 250k PPD from PCIe x1 2.0 on Linux? Looking for the best card for a narrow PCIe bus.
ComputerGenie
Posts: 230
Joined: Mon Dec 12, 2016 4:06 am

Re: PCI-e bandwidth/capacity limitations

Post by ComputerGenie »

foldy wrote:Looks good. You could put a case fan next to the GPU using cable ties. Or you could drill some holes in the case - then you can mount fans.
I've actually done that!
When I first put my one rig in my "miner room", the rack position wasn't ideal for a PC. So I took a spare fan for my S9s, plugged it in as the CPU opt fan, and ran a zip-tie through the top 2 holes in the fan and each of the cards. Looked and sounded like crap, but 5,600 RPM from a 12cm fan will keep 'em cool.
Aurum
Posts: 292
Joined: Sat Oct 03, 2015 3:15 pm
Location: The Great Basin

Re: PCI-e bandwidth/capacity limitations

Post by Aurum »

Today ryoungblood overtook Ed in daily PPD! Congrats, you beat me to it!
Do tell, what's the config of your latest folding rigs?
http://folding.extremeoverclocking.com/ ... 1&t=224497
ryoungblood wrote:FYI, those unshielded risers have a chance to drop your PPD. There is an ongoing discussion of this in the Overclockers.com Folding Forum. The EZDIY risers from Amazon are great.
I've heard this, but I've also heard not to use magnetic tools on PCs - yeah, back when they had 5" floppy drives, not today. What's the physics of this interference? Aren't the PCIe signals digital? Are some lines analog AC? Is it parasitic capacitance? Besides, if you shield the ribbon cable on the outside, how does that reduce line-to-line interference? If strong fields are coming from somewhere else, one should be able to insert a Gauss meter and detect them.
Risers have been so poorly made that they're just trash. I just got some Gigabyte Aorus 1080 Ti's, and those monsters are fat! I'm going to have to put them up on risers, so it's time I break down and buy the EZDIY unpowered risers.
In Science We Trust
ComputerGenie
Posts: 230
Joined: Mon Dec 12, 2016 4:06 am

Re: PCI-e bandwidth/capacity limitations

Post by ComputerGenie »

It's really hard to get an accurate comparison of shielded vs. non-shielded because of quality differences. That being said, I ditched all of my old, cheap non-shielded risers and replaced them with the "Thermaltake TT Gaming PCI-E x16 3.0" (looks about the same as the EZDIY version, but a tad shorter), and it made a big difference. The most notable difference was during boot-up: the Z170 boards do NOT want to boot in any timely fashion with more than one 1080 on the non-shielded risers (even powered versions), but with the Thermaltakes there was no noticeable difference vs. cards directly in slots. As far as PPD goes, there is no discernible difference between the Thermaltakes and direct-slotted cards (whereas it was kind of "hit and miss" whether the non-shielded risers would work correctly). I wish I had a comparable set of shielded vs. non-shielded risers to test the difference from the shielding alone, but I'm totally pleased with the more expensive, higher-quality version.