Accessing GPU inside a VM Bubble, Is It Possible?

I have a Mac Pro that runs High Sierra. Its CPUs are churning away running CPU work units for Folding@home and other projects.
But alas, none of them support GPUs on Macs, and this Mac has a decent GPU. To be specific, it has an OEM low-end Nvidia GPU and an older mid-range AMD GPU, which I'm trying to sell and trade up to a modern mid-range GPU. The monitor is on the older Nvidia card anyway. Is it possible for me to somehow grant a Windows virtual machine full hardware access to the GPU, so I can run GPU units in Windows while everything else runs on the Mac?
The machine has PLENTY of RAM, so running two OSes shouldn't be an issue at all.
Also note, this is VMware Fusion, their Mac VM client.
Re: Accessing GPU inside a VM Bubble, Is It Possible?
I could also use VirtualBox since it's free. If it can be done, it would have to be in one of those two options.
Re: Accessing GPU inside a VM Bubble, Is It Possible?
VirtualBox doesn't support GPUs. You have to use PCI passthrough for that to work, and as far as I know, only vSphere Hypervisor can do that. You can use it for free, but I never actually tried it myself.
https://www.vmware.com/products/vsphere-hypervisor.html
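For a rough idea of what's involved on ESXi (the free vSphere Hypervisor): in the host client you mark the GPU for passthrough under Manage -> Hardware -> PCI Devices, reboot the host, and then add the card to the VM as a PCI device. The VM's .vmx file then ends up with entries along these lines (the values here are only illustrative, and the exact keys can differ between ESXi versions):

pciPassthru0.present = "TRUE"
pciPassthru0.id = "0000:03:00.0"
hypervisor.cpuid.v0 = "FALSE"

That last line is the commonly cited workaround for Nvidia's consumer drivers refusing to load inside a VM; whether it's still needed depends on the driver version.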
Re: Accessing GPU inside a VM Bubble, Is It Possible?
In addition to what ajm mentioned, you would need 2 GPUs: 1 GPU for the host system and 1 GPU for the VM to fold on. I am aware that Linux would be more efficient (higher PPD) with the current set of FahCores (the software that does the folding), but with AMD GPUs it might be a bit tricky to get the GPU to fold under Linux. The initial steps would be to install the correct AMD drivers which have OpenCL support, then the OpenCL package (sudo apt-get install ocl-icd-opencl-dev), and then the FAHClient.
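For what it's worth, a rough sequence on an Ubuntu-based distro might look like the following; the AMD installer and its flags have changed across driver releases, and the FAHClient version below is only a placeholder, so check AMD's and Folding@home's own download pages first:

sudo ./amdgpu-pro-install --opencl=pal,legacy    # AMD's installer script from the downloaded driver package; newer releases use amdgpu-install with an OpenCL use case instead
sudo apt-get install ocl-icd-opencl-dev clinfo   # OpenCL ICD loader, plus clinfo to check the GPU
clinfo                                           # should list the AMD GPU as an OpenCL device before you start the client
sudo dpkg -i fahclient_<version>_amd64.deb       # the .deb from foldingathome.org; it sets the client up as a service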
ETA:
Now ↞ Very Soon ↔ Soon ↔ Soon-ish ↔ Not Soon ↠ End Of Time
Welcome To The F@H Support Forum Ӂ Troubleshooting Bad WUs Ӂ Troubleshooting Server Connectivity Issues
Re: Accessing GPU inside a VM Bubble, Is It Possible?
Linux is indeed significantly more efficient (than Windows) for CPU folding, but I haven't been able to demonstrate any consistent advantage when folding on GPUs, be it AMD's or Nvidia's.
Re: Accessing GPU inside a VM Bubble, Is It Possible?
ajm wrote: Linux is indeed significantly more efficient (than Windows) for CPU folding, but I haven't been able to demonstrate any consistent advantage when folding on GPUs, be it AMD's or Nvidia's.

With the previous GPU folding core, Core_21, there was a definite Linux advantage. Improvements in the Windows code that were included in Core_22 brought Windows and Linux GPU folding to basically even.
iMac 2.8 i7 12 GB smp8, Mac Pro 2.8 quad 12 GB smp6
MacBook Pro 2.9 i7 8 GB smp3
Re: Accessing GPU inside a VM Bubble, Is It Possible?
There's still an advantage on Linux.
But the PPD difference isn't as noticeable, because per-project PPD numbers are less consistent on Core_22 and because Core_22 hands out much larger numbers overall.
If a 2080 Ti gets 2.1M PPD in Windows, it gets 2.4M PPD on the same hardware in Linux.
I've managed to get up to 4.4M PPD on the highest-paying Core_22 projects (stock cooler, overclocked and power capped), while most Windows users reported 3.6 to 4.1M PPD.
Only one user I remember has reported 4.8M PPD on one WU, on a watercooled ROG STRIX, which by nature has the best-binned GPUs.
The 0.4M PPD difference between his GPU and mine is mostly QRB. The actual difference in the amount of work per hour would probably have been very small (1995 MHz on mine vs. 2050-2075 MHz on his).
But there's still a distinct difference between Linux and Windows.
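As a rough back-of-the-envelope (assuming the standard bonus formula, where points get multiplied by sqrt(k × deadline / completion time)): PPD then scales roughly with 1/time^1.5, so a ~4% clock advantage works out to about 1.04^1.5 ≈ 6% more PPD, or roughly 0.25-0.3M of that 0.4M gap, with binning and cooling plausibly covering the rest. The exact k and deadline are per-project, so that's only a ballpark.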
Re: Accessing GPU inside a VM Bubble, Is It Possible?
On any single occasion you can find anything and its opposite, but if you pay attention and note down all results over a period of time using Windows and then Linux on the same kit, Linux doesn't produce more on GPUs.
https://docs.google.com/spreadsheets/d/ ... edit#gid=0
Re: Accessing GPU inside a VM Bubble, Is It Possible?
I do have two GPUs anyway, so that's not a problem. I'll poke around with that hypervisor thing once the new card comes in.