BiFrost - 6 GPUs on x570 with m.2 to PCIex4 Adapters

gordonbb
Posts: 511
Joined: Mon May 21, 2018 4:12 pm
Hardware configuration: Ubuntu 22.04.2 LTS; NVidia 525.60.11; 2 x 4070ti; 4070; 4060ti; 3x 3080; 3070ti; 3070
Location: Great White North

BiFrost - 6 GPUs on x570 with m.2 to PCIex4 Adapters

Post by gordonbb »

Finally received my m.2 to PCIe3 x4 adapters today, so I started building my new folding rig.

I'm reusing a Gigabyte X570 Pro WiFi motherboard with an AMD Ryzen 7 2700X CPU and 8GB of DDR4-2400 in a Veddha 6-GPU mining frame. Rather than using USB PCIe3 x1 risers, I'm using an Asus Hyper M.2 v2 card to split the primary PCIe3 x16 slot into four PCIe3 x4 m.2 slots. Those four slots, plus the two integrated on the motherboard, drive 6 GPUs through powered m.2 to PCIe3 adapters.

Initial testing is showing no discernible loss in performance.
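
To confirm each card is actually negotiating the expected link, here's a minimal sketch (Python is assumed only for convenience) that reads the current PCIe generation and width from nvidia-smi:

Code: Select all

import subprocess

# Ask the NVidia driver for each GPU's negotiated PCIe generation and width.
out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=index,name,pcie.link.gen.current,pcie.link.width.current",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True)

for line in out.stdout.strip().splitlines():
    idx, name, gen, width = (field.strip() for field in line.split(","))
    print(f"GPU {idx} ({name}): PCIe gen {gen} x{width}")

Each riser-fed card should report a width of 4 under load (the link generation may train down when the card is idle).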
Paragon
Posts: 137
Joined: Fri Oct 21, 2011 3:24 am
Hardware configuration: Rig1 (Dedicated SMP): AMD Phenom II X6 1100T, Gigabyte GA-880GMA-USB3 board, 8 GB Kingston 1333 DDR3 Ram, Seasonic S12 II 380 Watt PSU, Noctua CPU Cooler

Rig2 (Part-Time GPU): Intel Q6600, Gigabyte 965P-S3 Board, EVGA 460 GTX Graphics, 8 GB Kingston 800 DDR2 Ram, Seasonic Gold X-650 PSU, Arctic Cooling Freezer 7 CPU Cooler
Location: United States

Re: BiFrost - 6 GPUs on x570 with m.2 to PCIex4 Adapters

Post by Paragon »

Great idea and write-up. Any thought of putting together some system-level efficiency plots (PPD/Watt) at the wall? I'm curious to see how this stacks up against typical 2-3 GPU machines. It should be more efficient overall, since you are running as much hardware as you can off of one CPU + board. I'd also be interested in seeing a few data points of GPU power target vs. PPD and power target vs. PPD/Watt.
rwh202
Posts: 410
Joined: Mon Nov 15, 2010 8:51 pm
Hardware configuration: 8x GTX 1080
3x GTX 1080 Ti
3x GTX 1060
Various other bits and pieces
Location: South Coast, UK

Re: BiFrost - 6 GPUs on x570 with m.2 to PCIex4 Adapters

Post by rwh202 »

Yes, thanks for the write-up.

The easy availability of mining-class hardware opens up a lot more possibilities.

Tempted to downsize and go this route myself - less hassle looking after a couple of 6 GPU rigs than having them split across 20 chassis.
gordonbb
Posts: 511
Joined: Mon May 21, 2018 4:12 pm
Hardware configuration: Ubuntu 22.04.2 LTS; NVidia 525.60.11; 2 x 4070ti; 4070; 4060ti; 3x 3080; 3070ti; 3070
Location: Great White North

Re: BiFrost - 6 GPUs on x570 with m.2 to PCIex4 Adapters

Post by gordonbb »

Paragon wrote:Great idea and write-up. Any thought of putting together some system-level efficiency plots (PPD/Watt) at the wall? I'm curious to see how this stacks up against typical 2-3 GPU machines. It should be more efficient overall, since you are running as much hardware as you can off of one CPU + board. I'd also be interested in seeing a few data points of GPU power target vs. PPD and power target vs. PPD/Watt.
I currently have one power supply on a UPS and the other on a Kill-a-Watt. To start getting efficiency numbers I need to procure a second 1500VA/900W UPS. The other issue is that Network UPS Tools (NUT) keeps losing its connection to the UPS over the USB interface, and I will need to log both UPSes and sum their output power to calculate overall efficiency. I miss my other UPSes with the network interfaces, but they are 700VA/450W units.

Back-of-napkin calculation: UPS + Kill-a-Watt = 1420 W; reported production = 7.59 MPPD, so 7.59 MPPD ÷ 1420 W ≈ 5.35 kPPD/W.

I'm also running a BOINC project (World Community Grid) on the remaining 10 threads of the CPU, so I'd have to disable that for a while to assess its overhead, but it's likely only about 50-70W.
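
Once the second UPS is in place, the logging itself should be straightforward. A minimal sketch of what I have in mind, assuming both UPSes are defined in NUT as ups0 and ups1 (hypothetical names) and expose the ups.realpower variable; some drivers only report a load percentage, in which case you'd estimate from ups.load and ups.realpower.nominal instead:

Code: Select all

import subprocess

def ups_watts(ups: str) -> float:
    """Read one UPS's output power (watts) via NUT's upsc utility."""
    out = subprocess.run(["upsc", ups, "ups.realpower"],
                         capture_output=True, text=True, check=True)
    return float(out.stdout.strip())

total_w = ups_watts("ups0@localhost") + ups_watts("ups1@localhost")
reported_ppd = 7.59e6  # taken from the client stats above

print(f"{total_w:.0f} W at the wall")
print(f"{reported_ppd / total_w / 1e3:.2f} kPPD/W")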

After 24 hours, an analysis of the WUs processed on the 2080 Super and 2070 Super shows that performance is well within one standard deviation of what was observed for the same WUs when the GPUs were installed in the PCIe3 x16 slots at x8/x8. So far there's no appreciable difference in performance moving from x8 to x4, and the rest of the GPUs are producing at averages similar to those observed previously (a sketch of the comparison follows the list):

GTX 1070 Ti - 895.92kPPD (+150MHz o/c)
RTX 2060 - 1.04MPPD
RTX 2060 S - 1.26MPPD
RTX 2070 - 1.34MPPD (+60MHz o/c)
RTX 2070 S - 1.51MPPD
RTX 2080 S - 1.84MPPD
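
Here's the shape of that standard-deviation check, with placeholder PPD samples standing in for my actual logs:

Code: Select all

from statistics import mean, stdev

# Per-WU PPD for the same projects before and after the move; the
# numbers below are placeholders, not my logged values.
ppd_x8 = [1.82e6, 1.86e6, 1.85e6, 1.83e6]  # card in an x8 slot
ppd_x4 = [1.81e6, 1.84e6, 1.86e6, 1.82e6]  # same card on an x4 riser

delta = mean(ppd_x4) - mean(ppd_x8)
print(f"mean delta: {delta / 1e3:+.1f} kPPD")
print(f"within one SD of the x8 runs: {abs(delta) <= stdev(ppd_x8)}")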

It's going down to -25C here tonight so my 1500W space heater will help keep the basement warm :-)
gordonbb
Posts: 511
Joined: Mon May 21, 2018 4:12 pm
Hardware configuration: Ubuntu 22.04.2 LTS; NVidia 525.60.11; 2 x 4070ti; 4070; 4060ti; 3x 3080; 3070ti; 3070
Location: Great White North

Re: BiFrost - 6 GPUs on x570 with m.2 to PCIex4 Adapters

Post by gordonbb »

rwh202 wrote:Yes, thanks for the write-up.

The easy availability of mining-class hardware opens up a lot more possibilities.

Tempted to downsize and go this route myself - less hassle looking after a couple of 6 GPU rigs than having them split across 20 chassis.
The mining chassis was discounted almost 60%. The final piece of the puzzle was the m.2 to PCIe3 x4 adapters. I'm really impressed with their build quality and the selection.

One downside to having 6 GPUs in one system is that should one slot get stuck, as in the recent issues, the whole system might have to be rebooted constantly.
Akaanc
Posts: 22
Joined: Fri Sep 27, 2019 8:20 pm

Re: BiFrost - 6 GPUs on x570 with m.2 to PCIex4 Adapters

Post by Akaanc »

I would like to watch a video of how you built it. Very nice.
Paragon
Posts: 137
Joined: Fri Oct 21, 2011 3:24 am
Hardware configuration: Rig1 (Dedicated SMP): AMD Phenom II X6 1100T, Gigabyte GA-880GMA-USB3 board, 8 GB Kingston 1333 DDR3 Ram, Seasonic S12 II 380 Watt PSU, Noctua CPU Cooler

Rig2 (Part-Time GPU): Intel Q6600, Gigabyte 965P-S3 Board, EVGA 460 GTX Graphics, 8 GB Kingston 800 DDR2 Ram, Seasonic Gold X-650 PSU, Arctic Cooling Freezer 7 CPU Cooler
Location: United States

Re: BiFrost - 6 GPUs on x570 with m.2 to PCIex4 Adapters

Post by Paragon »

Awesome! Thanks for the efficiency update...that makes sense with what I was thinking. I wonder...has anyone tried doing this with mining-specific GPUs? The P102-100, P104-100, and P106-100 or P106-90 are Pascal-based mining cards (1080 Ti, 1070, and 1060, I believe) with no video outputs and a few BIOS tweaks. They have depreciated like crazy, since once they become too slow for mining, people can't game on them due to the lack of a video out. Seems like we might be able to get some really cheap cards...
MeeLee
Posts: 1339
Joined: Tue Feb 19, 2019 10:16 pm

Re: BiFrost - 6 GPUs on x570 with m.2 to PCIex4 Adapters

Post by MeeLee »

Just make sure you don't use 2 PSUs on one motherboard, nor 2 UPSes on one system!

I bought myself a set on your recommendation. It's a good solution for RTX 2060-2080 GPUs.
gordonbb
Posts: 511
Joined: Mon May 21, 2018 4:12 pm
Hardware configuration: Ubuntu 22.04.2 LTS; NVidia 525.60.11; 2 x 4070ti; 4070; 4060ti; 3x 3080; 3070ti; 3070
Location: Great White North

Re: BiFrost - 6 GPUs on x570 with m.2 to PCIex4 Adapters

Post by gordonbb »

MeeLee wrote:Just make sure you don't use 2 PSUs on one motherboard, nor 2 UPSes on one system!

I bought myself a set on your recommendation. It's a good solution for RTX 2060-2080 GPUs.
Based on anecdotal evidence I've seen, I powered all the m.2 to PCIe3 x4 risers off the same PSU as the motherboard. That evidence suggested the PCIe interface of each GPU is powered from the same +12V rail that comes in through the slot, and that keeping the risers on the motherboard's supply avoids differences in ground potential which could, if large enough, harm the motherboard or the components on it.

In an attempt to balance the loads across the two supplies, PS0 powers the motherboard, all the risers, and the two lowest-power GPUs, while PS1 powers the four higher-power GPUs.

So a rule of thumb, I believe, could be stated as:
You must power the risers from the same PSU as the motherboard, but you can connect a GPU's PCIe power connectors to any power supply.

This is a subtle point, but one I believe is important; it's part of the trade-off of going with two lower-capacity power supplies for cost savings rather than a single large supply.
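
As a sanity check on that split, a rough rail budget; every wattage below is an assumed round figure for illustration, not a measurement from this rig:

Code: Select all

SLOT_W = 40  # assumed share of each GPU's draw pulled through its riser (spec max 75 W)

# Assumed board-power figures per card, in watts.
gpu_w = {"GTX 1070 Ti": 180, "RTX 2060": 160, "RTX 2060 S": 175,
         "RTX 2070": 175, "RTX 2070 S": 215, "RTX 2080 S": 250}
ps1_cards = ("RTX 2060 S", "RTX 2070", "RTX 2070 S", "RTX 2080 S")

# PS0 carries the board + CPU, the slot share of all six GPUs (the
# risers are fed from PS0), plus the PEG-cable share of its own two cards.
ps0 = 150 + 6 * SLOT_W + sum(gpu_w[g] - SLOT_W for g in ("GTX 1070 Ti", "RTX 2060"))
# PS1 carries only the PEG-cable share of the four bigger cards.
ps1 = sum(gpu_w[g] - SLOT_W for g in ps1_cards)

print(f"PS0 ~{ps0} W, PS1 ~{ps1} W (750 W each)")

With those assumptions both supplies land around 650 W, which lines up reasonably with the ~1420 W wall figure earlier once PSU efficiency losses are factored in.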
MeeLee
Posts: 1339
Joined: Tue Feb 19, 2019 10:16 pm

Re: BiFrost - 6 GPUs on x570 with m.2 to PCIex4 Adapters

Post by MeeLee »

My advice:
Don't do it!
I've blown 2 EVGA Bronze+ PSUs, and one Gold.
Funny enough, the Corsair was OK with it.
Even risers send voltage back to the mobo.
And it's not the mobo or hardware that suffers; in my case it was the PSUs that blew.
It's also not really the voltage difference; since it's 12V, and ranges from 11.4-12.5V, PSUs can handle it quite well.
It's when you turn off the PSUs: you'd have to turn them both off at exactly the same time, and hope that one PSU doesn't shut off faster than the other...
Good luck though...
gordonbb
Posts: 511
Joined: Mon May 21, 2018 4:12 pm
Hardware configuration: Ubuntu 22.04.2 LTS; NVidia 525.60.11; 2 x 4070ti; 4070; 4060ti; 3x 3080; 3070ti; 3070
Location: Great White North

Re: BiFrost - 6 GPUs on x570 with m.2 to PCIex4 Adapters

Post by gordonbb »

MeeLee wrote:My advice:
Don't do it!
I've blown 2 EVGA Bronze+ PSUs, and one Gold.
Funny enough, the Corsair was OK with it.
Even risers send voltage back to the mobo.
And it's not the mobo or hardware that suffers; in my case it was the PSUs that blew.
It's also not really the voltage difference; since it's 12V, and ranges from 11.4-12.5V, PSUs can handle it quite well.
It's when you turn off the PSUs: you'd have to turn them both off at exactly the same time, and hope that one PSU doesn't shut off faster than the other...
Good luck though...
I'm using a Thermaltake power supply adapter that ties the power-on signal from the first power supply to the second so, in theory, they should both turn on and off fairly close to each other. And they're matched Corsair RMx 750W power supplies. No problems so far.
gordonbb
Posts: 511
Joined: Mon May 21, 2018 4:12 pm
Hardware configuration: Ubuntu 22.04.2 LTS; NVidia 525.60.11; 2 x 4070ti; 4070; 4060ti; 3x 3080; 3070ti; 3070
Location: Great White North

Re: BiFrost - 6 GPUs on x570 with m.2 to PCIex4 Adapters

Post by gordonbb »

The results are in.

Going from x8 to x4 under Linux, the 2080 Super showed a 2.48% decrease in production (PPD) while the 2070 Super showed a 0.34% increase, with both values within one standard deviation.

We can say with some confidence:
Under Linux, a higher-end NVidia GPU sees no significant decrease in production moving from a PCIe3 x8 to a PCIe3 x4 slot.

Comparing Windows to Linux production on a PCIe3 x4 link, we observed:

-29.3% RTX 2080 Super
-21.6% RTX 2070 Super
-18.1% RTX 2070
-23.0% RTX 2060 Super
-21.1% RTX 2060
-20.0% GTX 1070 Ti

The RTX 2080 Super's production appears to suffer the most under Windows on an x4 link, but this may have been compounded by it being in the X570 motherboard slot that hangs off the chipset, and hence it was further limited by the shared PCIe3 x4 link between the Ryzen 7 2700X CPU and the chipset.

The general observation is:

Under Windows, a higher-end NVidia GPU on a PCIe3 x4 link will see roughly a 20% decrease in Folding@home production in points per day (PPD) compared to the same card under Linux.
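
For reference, the deltas above are plain relative changes. A one-line sketch using the 2080 Super, where the Windows figure is back-calculated from the -29.3% entry purely to illustrate the formula:

Code: Select all

linux_ppd = 1.84e6    # 2080 S under Linux, from earlier in the thread
windows_ppd = 1.30e6  # back-calculated from the table, not a separate measurement

delta = (windows_ppd - linux_ppd) / linux_ppd
print(f"{delta:+.1%}")  # -29.3%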
MeeLee
Posts: 1339
Joined: Tue Feb 19, 2019 10:16 pm

Re: BiFrost - 6 GPUs on x570 with m.2 to PCIex4 Adapters

Post by MeeLee »

Which riser adapters did you use?
My card arrived and has m.2 to PCIe x4 adapters, but I can't find any PCIe x4 to x16 ribbon cables.
The only option I currently have is a PCIe x4 to x16 adaptor plug, but it sticks out of the card.
Any ideas?
gordonbb
Posts: 511
Joined: Mon May 21, 2018 4:12 pm
Hardware configuration: Ubuntu 22.04.2 LTS; NVidia 525.60.11; 2 x 4070ti; 4070; 4060ti; 3x 3080; 3070ti; 3070
Location: Great White North

Re: BiFrost - 6 GPUs on x570 with m.2 to PCIex4 Adapters

Post by gordonbb »

MeeLee wrote:Which riser adapters did you use?
My card arrived and has m.2 to PCIe x4 adapters, but I can't find any PCIe x4 to x16 ribbon cables.
The only option I currently have is a PCIe x4 to x16 adaptor plug, but it sticks out of the card.
Any ideas?
The R42-series m.2 to PCIe3 x4 adapters from ADT-Link have an open-backed socket and so can accept the front contacts of a PCIe3 x16 card, but I ordered the ones with a mechanical PCIe3 x16 connector (wired electrically as x4) so they would make a more solid mechanical connection.

I’m using ADT-Link R43 STX m.2 to PCIe3 x16 adapters. Specifically:
R43MR 25cm
R43UR 10cm
R43UL 15cm
R43UL 25cm
R43UL 35cm
R43ML 50cm

Those are, in order, for the upper m.2 slot, then the Asus Hyper M.2 v2 card from top to bottom, and lastly the lower m.2 slot. This is for a Gigabyte X570 Pro WiFi in a Veddha Deluxe 6-bay miner case, so your cable lengths will vary depending on the slot placement on your motherboard and the chassis used.

I stripped the connectors off an old IDE ribbon cable, peeled it down to about the same width as the cables on the adapters, marked it at 5 cm intervals, and used that to measure the cable lengths for ordering the adapters.

If your adaptor has a solid back on the PCIe x4 socket, you may be able to cut a notch in the back of it, but you'd have to be careful not to damage the pins or leave any plastic shavings in the connector.
HaloJones
Posts: 906
Joined: Thu Jul 24, 2008 10:16 am

Re: BiFrost - 6 GPUs on x570 with m.2 to PCIex4 Adapters

Post by HaloJones »

Insane! I would never have thought of this; hell, I didn't even know you could slot an M.2 card into a PCIe slot.

Great work!
single 1070
