In the process of choosing a new video card: Is NVIDIA really that bad at computing?

Dr.Alexx
Joined: 14 Aug 05
Posts: 22
Credit: 5135173
RAC: 1
Topic 196544

Hello! Would someone tell me if the GTX 670 is really that bad at E@H compared to the HD 7950?

Taking everything except E@H into consideration, I would prefer the GTX 670 over the HD 7950.

But tests show that it is really bad at computing. Is that so?

Jeroen
Joined: 25 Nov 05
Posts: 379
Credit: 740030628
RAC: 0

In the process of choosing a new video card: Is NVIDIA really that bad at computing?

Quote:

Hello! Would someone tell me if the GTX 670 is really that bad at E@H compared to the HD 7950?

Taking everything except E@H into consideration, I would prefer the GTX 670 over the HD 7950.

But tests show that it is really bad at computing. Is that so?

Both cards should be a great choice for this project. However, to take the most advantage of either card with Einstein BRP4, I would suggest running it on a board and CPU that support the PCI-E 3.0 standard.

The one area where Kepler cards like the GTX 670 perform very poorly is double-precision math, which is not used by this project. If you plan to run a project like Milkyway@Home, double-precision math is a requirement there, and the latest AMD cards perform far better than Kepler cards.
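If you want to see the FP32/FP64 gap on your own card before committing, a small CUDA micro-benchmark along these lines will show it. This is just a rough sketch of my own, not anything from the project apps; the kernel names, loop count, and launch sizes are arbitrary. On a GTX 670 the double run should come out far slower, since the GK104 chip does FP64 at 1/24 of its FP32 rate:

    #include <cstdio>
    #include <cuda_runtime.h>

    #define ITER    100000   // FMAs per thread, arbitrary
    #define BLOCKS  256
    #define THREADS 256

    // Dependent multiply-add chains so the kernels are compute-bound
    // and the compiler cannot optimize the loop away.
    __global__ void fma_f32(float *out) {
        float a = 1.000001f;
        for (int i = 0; i < ITER; ++i) a = a * 0.999999f + 0.000001f;
        out[blockIdx.x * blockDim.x + threadIdx.x] = a;  // keep result live
    }

    __global__ void fma_f64(double *out) {
        double a = 1.000001;
        for (int i = 0; i < ITER; ++i) a = a * 0.999999 + 0.000001;
        out[blockIdx.x * blockDim.x + threadIdx.x] = a;
    }

    int main() {
        int n = BLOCKS * THREADS;
        float *d32; double *d64;
        cudaMalloc(&d32, n * sizeof(float));
        cudaMalloc(&d64, n * sizeof(double));

        cudaEvent_t t0, t1;
        cudaEventCreate(&t0); cudaEventCreate(&t1);
        float ms32, ms64;

        fma_f32<<<BLOCKS, THREADS>>>(d32);   // warm-up launch
        cudaEventRecord(t0);
        fma_f32<<<BLOCKS, THREADS>>>(d32);
        cudaEventRecord(t1);
        cudaEventSynchronize(t1);
        cudaEventElapsedTime(&ms32, t0, t1);

        fma_f64<<<BLOCKS, THREADS>>>(d64);   // warm-up launch
        cudaEventRecord(t0);
        fma_f64<<<BLOCKS, THREADS>>>(d64);
        cudaEventRecord(t1);
        cudaEventSynchronize(t1);
        cudaEventElapsedTime(&ms64, t0, t1);

        printf("FP32: %.1f ms  FP64: %.1f ms  (FP64 is %.1fx slower)\n",
               ms32, ms64, ms64 / ms32);

        cudaFree(d32); cudaFree(d64);
        return 0;
    }

Compile with nvcc -arch=sm_30 for Kepler.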

Dr.Alexx
Joined: 14 Aug 05
Posts: 22
Credit: 5135173
RAC: 1

I run this on Sandy Bridge

I run this on a Sandy Bridge 2600K, so PCI-E 2.0 only. The board is capable of accepting an Ivy Bridge CPU (and, thus, PCI-E 3.0), but I don't want to deal with the hassle of swapping the CPU right now. How much do I lose in this case?

Jeroen
Joined: 25 Nov 05
Posts: 379
Credit: 740030628
RAC: 0

With NVIDIA, the performance

With NVIDIA, the performance difference between PCI-E 2.0 and 3.0 is around 15%. I have not tested the AMD cards myself, but I suspect that the AMD OpenCL application can benefit from the extra bandwidth just as the CUDA application does.

You can still get very good performance out of these cards via PCI-E 2.0.
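For reference, PCI-E 2.0 x16 tops out at 8 GB/s theoretical (roughly 5-6 GB/s in practice), and PCI-E 3.0 x16 at about 15.75 GB/s. If you want to measure what your own link actually delivers, a pinned-memory copy test like the sketch below works. The buffer size is arbitrary, and this measures raw host-to-device transfer bandwidth, not what the Einstein app itself moves:

    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        const size_t bytes = 256 << 20;   // 256 MiB test buffer, arbitrary
        void *host, *dev;
        cudaMallocHost(&host, bytes);     // pinned host memory for full speed
        cudaMalloc(&dev, bytes);

        cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);  // warm-up

        cudaEvent_t t0, t1;
        cudaEventCreate(&t0); cudaEventCreate(&t1);
        cudaEventRecord(t0);
        cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);
        cudaEventRecord(t1);
        cudaEventSynchronize(t1);

        float ms;
        cudaEventElapsedTime(&ms, t0, t1);
        printf("Host->device: %.2f GB/s\n", (bytes / 1e9) / (ms / 1e3));

        cudaFreeHost(host);
        cudaFree(dev);
        return 0;
    }

On a PCI-E 2.0 x16 slot you should see something in the 5-6 GB/s range; a 3.0 slot roughly doubles that.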
