Six native petaFLOPS
Moderators: Site Moderators, FAHC Science Team
-
- Site Moderator
- Posts: 2850
- Joined: Mon Jul 18, 2011 4:44 am
- Hardware configuration: OS: Windows 10, Kubuntu 19.04
CPU: i7-6700k
GPU: GTX 970, GTX 1080 TI
RAM: 24 GB DDR4
- Location: Western Washington
Six native petaFLOPS
I noticed today that Folding@home finally has enough computing power to cross over the 6 native petaFLOP barrier. I find this quite impressive, and wanted to point it out. From the OS stats page I saw this:
This accomplishment really is remarkable considering that according to Dr. Pande's blog the 5 petaFLOP milestone was passed in February of 2009. And as you can see we have about 7.9 x86 petaFLOPS, from about 447,000 active CPUs, about 35,000 active GPUs, and about 6,000 active PS3s. I find it quite amazing that such a large, powerful, and diverse system can be voluntarily amassed and utilized.
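As a rough back-of-the-envelope check on those figures (just a sketch in Python using the aggregate numbers quoted above; the per-OS breakdown from the stats page isn't reproduced here, so no split between CPU, GPU, and PS3 FLOPS is assumed):

# Back-of-the-envelope check on the aggregate numbers quoted above.
native_pflops = 6.0        # native petaFLOPS barrier just crossed
x86_pflops = 7.9           # x86-equivalent petaFLOPS
active_cpus = 447_000
active_gpus = 35_000
active_ps3s = 6_000

clients = active_cpus + active_gpus + active_ps3s
print(f"Active clients: {clients:,}")                                      # ~488,000
print(f"x86-to-native ratio: {x86_pflops / native_pflops:.2f}")            # ~1.32
print(f"Average x86 GFLOPS per client: {x86_pflops * 1e6 / clients:.1f}")  # ~16 GFLOPS

So on average each active client contributes on the order of 16 x86 GFLOPS, though the real distribution is presumably skewed heavily toward the GPUs and PS3s.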
For some FLOP-to-FLOP comparisons:
Other DC projects: As of today, the BOINC stats page reports that the combined efforts of all distributed computing projects using the BOINC middleware add up to 5.262 petaFLOPS. It is my understanding that BOINC is mostly CPU-based, so it speaks well for them that they can assemble that much computing power. There are many BOINC projects that greatly assist researchers in many different scientific areas. For example, PrimeGrid currently sits at about 1.4 petaFLOPS (they apparently use GPUs too), Milkyway@home at 0.578 petaFLOPS, and Collatz Conjecture at 0.545 petaFLOPS, and there are many other BOINC-based projects focused on various problems. I'm glad that Folding@home ensures data integrity using methods other than the "quorum" approach, even though the client has to be closed-source to do so.
Supercomputers: Although DC projects have a vastly different architecture than supercomputers, as measured in June 2011, Japan's K supercomputer stands at 8.162 petaFLOPS and China's Tianhe-1A runs at 2.566 petaFLOPS. Those numbers are also quite impressive. How amazing it is that we have this kind of computing power available!
I just wanted to throw in those numbers for a sort of comparison. I hope we stay above this level, and I'm confident that v7's default configuration of SMP+GPU will greatly assist both Folding@home's FLOP numbers and its overall scientific production. How great it is that we can be a part of this massive effort!
F@h is now the top computing platform on the planet and nothing unites people like a dedicated fight against a common enemy. This virus affects all of us. Let's end it together.
Re: Six native petaFLOPS
I wonder if someone can report on the accuracy of these numbers.
The ATI GPU active CPUs recently increased by a factor of 3x, from the roughly 8,000 reported for many months to the current 26,000.
Is there a possibility that recent server changes have affected this reporting?
Has a supercomputer with 16,000 ATI GPU units recently come online?
Or have that many converts to the GPU client accrued in the last 2 months?
Transparency and Accountability, the necessary foundation of any great endeavor!
-
- Posts: 10179
- Joined: Thu Nov 29, 2007 4:30 pm
- Hardware configuration: Intel i7-4770K @ 4.5 GHz, 16 GB DDR3-2133 Corsair Vengeance (black/red), EVGA GTX 760 @ 1200 MHz, on an Asus Maximus VI Hero MB (black/red), in a blacked out Antec P280 Tower, with a Xigmatek Night Hawk (black) HSF, Seasonic 760w Platinum (black case, sleeves, wires), 4 SilenX 120mm Case fans with silicon fan gaskets and silicon mounts (all black), a 512GB Samsung SSD (black), and a 2TB Black Western Digital HD (silver/black).
- Location: Arizona
- Contact:
Re: Six native petaFLOPS
Could the new, bigger GPU WUs be using the GPUs to their fuller potential?
Getting more regular SMP work units instead of bigadv ones, freeing up a few cycles for the GPUs to use?
How to provide enough information to get helpful support
Tell me and I forget. Teach me and I remember. Involve me and I learn.
Re: Six native petaFLOPS
Yeah, it's only about a 3.5% change relative to the total base.
Perhaps more people are now willing to fold on their GPUs to help with heating in the winter.
All good if accurate; it was just the abruptness of the change that I noticed.
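For reference, a quick sanity check on that 3.5% figure (assuming the "total base" means the roughly 488,000 active clients quoted at the top of the thread; the exact base isn't stated):

# Sanity check on the "about 3.5%" figure, assuming the total base is the
# ~488,000 active clients quoted earlier in the thread.
ati_before = 8_000                        # ATI GPU clients reported for many months
ati_after = 26_000                        # current ATI GPU clients
total_base = 447_000 + 35_000 + 6_000     # CPUs + GPUs + PS3s

increase = ati_after - ati_before
print(f"Increase in ATI GPU clients: {increase:,}")           # 18,000
print(f"Share of total base: {increase / total_base:.1%}")    # ~3.7%, the same ballpark as 3.5%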
Transparency and Accountability, the necessary foundation of any great endeavor!
Re: Six native petaFLOPS
Colder times are helping the numbers to grow; they always have in the winter.
mdk777 wrote: Yeah, it's only about a 3.5% change relative to the total base.
Perhaps more people are now willing to fold on their GPUs to help with heating in the winter.
All good if accurate; it was just the abruptness of the change that I noticed.
-
- Site Moderator
- Posts: 2850
- Joined: Mon Jul 18, 2011 4:44 am
- Hardware configuration: OS: Windows 10, Kubuntu 19.04
CPU: i7-6700k
GPU: GTX 970, GTX 1080 TI
RAM: 24 GB DDR4
- Location: Western Washington
Re: Six native petaFLOPS
That makes sense. We are now at 6.2 native petaFLOPS, about 8.1 x86 petaFLOPS, and it looks like it's mostly due to the 1,700 ATI GPUs that were fired up.
Ivoshiee wrote: Colder times are helping the numbers to grow; they always have in the winter.
mdk777 wrote: Yeah, it's only about a 3.5% change relative to the total base.
Perhaps more people are now willing to fold on their GPUs to help with heating in the winter.
All good if accurate; it was just the abruptness of the change that I noticed.
Impressive!
F@h is now the top computing platform on the planet and nothing unites people like a dedicated fight against a common enemy. This virus affects all of us. Let's end it together.
Re: Six native petaFLOPS
I think I have the answer to the question: Bitcoin.
A lot of people bought a lot of ATI GPUs to mine Bitcoins, but with the recent price drops they have started using their ATI cards for something else, especially folding.
And yes, there are a LOT of ATI cards busy with Bitcoin, really a lot.
-
- Site Moderator
- Posts: 2850
- Joined: Mon Jul 18, 2011 4:44 am
- Hardware configuration: OS: Windows 10, Kubuntu 19.04
CPU: i7-6700k
GPU: GTX 970, GTX 1080 TI
RAM: 24 GB DDR4
- Location: Western Washington
Re: Six native petaFLOPS
Hmm. Perhaps. I think it may have something to do with PPD, points per watt, or server availability. Anyway, Bitcoin isn't a distributed computing project; it's just an electronic currency platform. But it's their GPU, and I understand GPUs are faster than CPUs at Bitcoin's hashing, so I can see them doing that if they wish. As for me, I prefer using real dollars while my computer is furiously doing disease research.
gabi.2437 wrote: I think I have the answer to the question: Bitcoin.
A lot of people bought a lot of ATI GPUs to mine Bitcoins, but with the recent price drops they have started using their ATI cards for something else, especially folding.
And yes, there are a LOT of ATI cards busy with Bitcoin, really a lot.
F@h is now the top computing platform on the planet and nothing unites people like a dedicated fight against a common enemy. This virus affects all of us. Let's end it together.
-
- Posts: 533
- Joined: Tue May 27, 2008 11:56 pm
- Hardware configuration: Parts:
Asus H370 Mining Master motherboard (X2)
Patriot Viper DDR4 memory 16gb stick (X4)
Nvidia GeForce GTX 1080 gpu (X16)
Intel Core i7 8700 cpu (X2)
Silverstone 1000 watt psu (X4)
Veddha 8 gpu miner case (X2)
Thermaltake hsf (X2)
Ubit riser card (X16)
- Location: Ames, Iowa
Re: Six native petaFLOPS
bitcoin -- meh!
Re: Six native petaFLOPS
I'm not here to start a flame war about Bitcoin; I just stated that the ATI increase is due to Bitcoin.
As for using real dollars, you need to have them. A lot of these ATI cards were paid for by Bitcoin mining; otherwise they would have been too expensive (as I said, we are talking about rigs with A LOT of ATI cards). Now that the cards have paid for themselves and made a nice gain, their owners are using them for folding.
Also remember the power consumption; paying the electricity bill for these monsters is not cheap.
-
- Site Moderator
- Posts: 2850
- Joined: Mon Jul 18, 2011 4:44 am
- Hardware configuration: OS: Windows 10, Kubuntu 19.04
CPU: i7-6700k
GPU: GTX 970, GTX 1080 TI
RAM: 24 GB DDR4
- Location: Western Washington
Re: Six native petaFLOPS
Apologies. I for one didn't want to give the impression of attacking Bitcoin; it's a perfectly valid and interesting project. In the very best case, Bitcoin may get its system perfected, reach a critical level of usage, and we might all switch over and solve many of the problems currently plaguing our currencies. It's just that as long as projects like Folding@home are around, I'm sticking with them instead. Thank you for explaining about the ATI GPUs; your reasons make sense and may accurately explain the current patterns. It's an interesting concept, having GPUs pay for themselves by fetching electronic currencies, but then again, perhaps some of us feel that points are a form of currency and consider that we got our money's worth out of our hardware that way as well. And I feel that, at least for my rig, the power consumption of running F@h is worth it.
gabi.2437 wrote: I'm not here to start a flame war about Bitcoin; I just stated that the ATI increase is due to Bitcoin.
As for using real dollars, you need to have them. A lot of these ATI cards were paid for by Bitcoin mining; otherwise they would have been too expensive (as I said, we are talking about rigs with A LOT of ATI cards). Now that the cards have paid for themselves and made a nice gain, their owners are using them for folding.
Also remember the power consumption; paying the electricity bill for these monsters is not cheap.
F@h is now the top computing platform on the planet and nothing unites people like a dedicated fight against a common enemy. This virus affects all of us. Let's end it together.
Re: Six native petaFLOPS
I know I'm reviving an oldish thread, but am reading up on F@H and found this. Wouldn't the increase in ATI GPUs be due to Llano?
-
- Posts: 10179
- Joined: Thu Nov 29, 2007 4:30 pm
- Hardware configuration: Intel i7-4770K @ 4.5 GHz, 16 GB DDR3-2133 Corsair Vengeance (black/red), EVGA GTX 760 @ 1200 MHz, on an Asus Maximus VI Hero MB (black/red), in a blacked out Antec P280 Tower, with a Xigmatek Night Hawk (black) HSF, Seasonic 760w Platinum (black case, sleeves, wires), 4 SilenX 120mm Case fans with silicon fan gaskets and silicon mounts (all black), a 512GB Samsung SSD (black), and a 2TB Black Western Digital HD (silver/black).
- Location: Arizona
- Contact:
Re: Six native petaFLOPS
Hello dcdc, welcome to the forum.
Not likely. On-board chips don't fold well. Did ATI have a holiday sale? Did a popular game that runs better on ATI cards release a new sequel, so people upgraded? An updated fahcore_16 that performs better? (Not lately, but possible.) Or even just a programming glitch in the stats? Hard to tell...
How to provide enough information to get helpful support
Tell me and I forget. Teach me and I remember. Involve me and I learn.