Regarding any new WUs that get produced, could they all be written in such a way that they run exclusively on GPUs, or does some of the research absolutely have to call upon instruction sets that only CPUs can execute?

JimboPalmer wrote: Historically, there used to be WUs that had to be done on CPUs. Until 'recently', GPUs could not solve problems with explicit solvation while CPUs could. This has been solved.*

My only gripe would be: wouldn't said underutilized GPU be better employed on low-atom-count WUs than a CPU?
https://en.wikipedia.org/wiki/Solvent_model
Today, the researchers choose between CPU and GPU work by the complexity of the molecule. Smaller molecules would 'waste' large GPU cards, since they do not have enough bonds to occupy all the shaders; those that are small enough occupy the CPUs, which are so abundant. As GPUs and CPUs get more complex (there are Epycs with 128 cores / 256 threads), the size of a 'small' molecule is also increasing.
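To make the idea concrete, here is a toy sketch of that dispatch rule. The function name, the threshold, and the example atom counts are all hypothetical illustrations, not Folding@home's actual assignment logic or real WU data:

```python
# Toy illustration: route a work unit to CPU or GPU by system size.
# The 50,000-atom threshold is a made-up number for demonstration only.

def assign_platform(atom_count: int, gpu_threshold: int = 50_000) -> str:
    """Pick a compute platform for a hypothetical work unit.

    Small systems cannot fill all the shaders on a large GPU,
    so they run on the far more abundant CPU cores instead.
    """
    return "GPU" if atom_count >= gpu_threshold else "CPU"

# Hypothetical work units (atom counts invented for illustration)
work_units = {"small peptide": 8_000, "solvated protein": 120_000}
assignments = {name: assign_platform(n) for name, n in work_units.items()}
print(assignments)  # small peptide -> CPU, solvated protein -> GPU
```

As the thread notes, a real threshold would have to drift upward over time: a 128-core Epyc redefines what counts as "small enough for a CPU".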
*Some part of the answer to "Why doesn't my ancient GPU work any more? I used to fold on it in 2018" is that the researchers had to code for features old GPUs do not have.