Slow unit times (GTX 970)

Ryan
Joined: 25 Nov 14
Posts: 37
Credit: 147735836
RAC: 74439
Topic 197879

So I replaced my R9 270 with a GTX 970 yesterday and I am seeing some slow unit times.

I'm comparing it to my 290X, so I'm guessing it's not a fair comparison, though I would expect both cards to perform about the same.

Running 3 units on each card, interestingly the 290X is using 0.5 CPUs per unit and the 970 only 0.2. Is there any way to get them both to use the same amount of CPU time per unit for a fair test?

Anyhow, the results are:

Arecibo on 290x around 40 - 50 mins per unit

Arecibo on 970 around 1:05 per unit

Perseus on 290x around 2 - 2.5 hours per unit

Perseus on 970 around 4 hours 40

Does that sound right? I would have thought they would perform about the same.

archae86
Joined: 6 Dec 05
Posts: 3159
Credit: 7245226686
RAC: 1310104

Quote:

Running 3 units on each card

Perseus on 970 around 4 hours 40


I have an SC 970 running on a Haswell two-core motherboard. Perseus elapsed times running 3X fall tightly in the 3:57 to 4:02 range.

I don't run a mix of types. I run just two CPU jobs, held by processor affinity to one side of the hyperthreaded cores, in an effort to hold down latency on the GPU jobs from delay in CPU support.

I don't think Maxwell 2 cards and the current Einstein Perseus code are good friends. Perhaps the hoped-for CUDA 5.5 versions coming soon will help with this. Maybe not.

Ryan
Joined: 25 Nov 14
Posts: 37
Credit: 147735836
RAC: 74439

Ahh ok so this might be normal then?

I looked at the compute power of the card: 3494 GFLOPS, compared to 5632 GFLOPS for the 290X.

It must be heavily optimized for gaming and not so much for compute work.
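To put rough numbers on that gap, here's a back-of-envelope sketch using the figures quoted in this thread (the Perseus times are the approximate ones from the opening post):

```python
# Raw single-precision throughput, as quoted above.
gtx970_gflops = 3494.0
r9_290x_gflops = 5632.0
flops_ratio = r9_290x_gflops / gtx970_gflops
print(f"raw FLOPS advantage of the 290X: {flops_ratio:.2f}x")  # ~1.61x

# Observed Perseus elapsed times from the opening post (approximate).
t_290x_h = 2.25  # midpoint of "2 - 2.5 hours"
t_970_h = 4.67   # "around 4 hours 40"
time_ratio = t_970_h / t_290x_h
print(f"observed slowdown of the 970: {time_ratio:.2f}x")  # ~2.08x
```

The observed slowdown is worse than raw FLOPS alone would predict, so something other than shader throughput is likely the bottleneck.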

Gary Roberts
Moderator
Joined: 9 Feb 05
Posts: 5874
Credit: 117983418199
RAC: 21413863

Quote:
Running 3 units on each card, interestingly the 290X is using 0.5 CPUs per unit and the 970 only 0.2. Is there any way to get them both to use the same amount of CPU time per unit for a fair test?


The 290x isn't 'using' 0.5 CPUs, necessarily. Each task comes with a 'recommendation' from the project of the likely CPU support needed. BOINC uses this value to calculate how many CPU cores should be prevented from crunching CPU tasks in order to provide this support. In doing the calculation, BOINC disregards fractional cores. For AMD GPUs the recommendation is 0.5 CPUs per GPU task. For nvidia, it's 0.2 CPUs per GPU task.

So for running 3x on AMD, BOINC would only free-up one CPU core. You would probably get better performance by running 4x and having BOINC free-up two full cores. For running 3x or 4x on nvidia, BOINC wouldn't reserve any cores. Of course, all of this becomes irrelevant if you're not doing CPU crunching anyway. At the moment your 970 host doesn't seem to have CPU tasks whereas your 290x one does.
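As a minimal sketch of that rounding behaviour (the function name and structure here are mine, not BOINC's actual code), using the per-task values quoted above:

```python
import math

def cores_reserved(gpu_tasks: int, cpu_per_task: float) -> int:
    """Whole CPU cores BOINC keeps free of CPU tasks for GPU support;
    fractional cores are disregarded (rounded down), as described above."""
    return math.floor(gpu_tasks * cpu_per_task)

# Project 'recommendations' quoted in this thread:
# 0.5 CPUs per GPU task on AMD, 0.2 on nvidia.
print(cores_reserved(3, 0.5))  # 3x on AMD    -> 1 core freed
print(cores_reserved(4, 0.5))  # 4x on AMD    -> 2 cores freed
print(cores_reserved(3, 0.2))  # 3x on nvidia -> 0 cores freed
```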

Also, you have more pressing things to worry about, it would seem. On your 290X host, since around Dec 12th, all returned GPU results presented for validation have ended up as validate errors or invalid.

Cheers,
Gary.

ExtraTerrestrial Apes
Joined: 10 Nov 04
Posts: 770
Credit: 580331504
RAC: 123890

In gaming those GPUs are about equal. Number-crunching performance always depends greatly on the actual code. Einstein needs lots of memory bandwidth, which is a weak spot of the GTX 970/980 and a strong point of the R9 290X.

And for Arecibo tasks your GTX 970 may actually be more efficient, despite the performance penalty. The 290X usually uses a lot of energy, about 250 W. The GTX 970 should be set to 145 W by default, whereas most factory-overclocked ones are set to 160 W. Not sure about yours, but it's very probably costing you significantly less to run.

BTW: take a look at this, if you haven't yet.

MrS

Scanning for our furry friends since Jan 2002

disturber
Joined: 26 Oct 14
Posts: 30
Credit: 57155818
RAC: 0

I have the Gigabyte version of the 970 card. Using TechPowerUp GPU-Z, it reads that the card uses about 47% of TDP. Since the max TDP is 160 W, the card is drawing less than 80 W of power. My gut feeling says that's about right, since the temps are only 56 C with a fan speed of 60%.
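For anyone checking that arithmetic, the GPU-Z percentage converts to watts like this (both input figures are the ones assumed in this post):

```python
# GPU-Z reports board power as a percentage of the power limit (TDP).
tdp_limit_w = 160.0   # maximum TDP assumed in the post
gpu_z_reading = 0.47  # "about 47% of TDP"

draw_w = tdp_limit_w * gpu_z_reading
print(f"{draw_w:.1f} W")  # -> 75.2 W, i.e. under the 80 W claimed
```

Note the result is only as good as the assumed TDP limit; with a different board power limit the same 47% reading would mean a different wattage.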

Also, if you follow these posts

http://einsteinathome.org/node/197852

they show how to increase the memory clock to what the card normally runs at in gaming mode. For some reason the memory clock is reduced in compute mode.

ExtraTerrestrial Apes
Joined: 10 Nov 04
Posts: 770
Credit: 580331504
RAC: 123890

Quote:
I have the Gigabyte version of the 970 card. Using TechPowerUp GPU-Z, it reads that the card uses about 47% of TDP. Since the max TDP is 160 W, the card is drawing less than 80 W of power. My gut feeling says that's about right, since the temps are only 56 C with a fan speed of 60%.


Probably not. The Gigabyte G1 Gaming has a TDP of 300 W, and I haven't seen any smaller GTX 970 built by them. That would mean a power draw of ~140 W, which would not contradict your temperatures. The G1 Gaming has massive cooling, which is also why it's the longest of all GTX 970s currently available.

MrS

Scanning for our furry friends since Jan 2002

disturber
Joined: 26 Oct 14
Posts: 30
Credit: 57155818
RAC: 0

You are right. I did a quick check with my UPS, which can measure power. I read the power draw with everything running, then on the activity tab I set the GPU to off and read the power draw again: it dropped from 274 W to 140 W. That's a difference of 134 W; add the 15 W idle power figure gathered from reviews and I get roughly 150 W, very close to what you mentioned. This is with the full memory clock rate.
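Reconstructing that wall-socket estimate step by step (all figures are the ones from this post):

```python
# The delta between the two UPS readings misses the card's idle draw,
# which is still present when GPU crunching is switched off in BOINC.
load_w = 274.0      # UPS reading with the GPU crunching
gpu_off_w = 140.0   # UPS reading with GPU activity set to off
idle_w = 15.0       # card's idle draw, taken from reviews

card_w = (load_w - gpu_off_w) + idle_w
print(card_w)  # -> 149.0, i.e. roughly the 150 W figure quoted
```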

And yes, the card is massive and quite long. It's a good thing the power connectors are on the top.

ExtraTerrestrial Apes
Joined: 10 Nov 04
Posts: 770
Credit: 580331504
RAC: 123890

That's still very good :)

BTW: those 150 W also include the PSU's inefficiency. Assuming an 80 Plus Gold unit under moderate load (91% efficiency), your card would consume about 137 W (and would report that value, since it doesn't know which PSU you have) and the PSU about 13 W.
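The split works out like this (the 91% efficiency figure is the assumption stated above, not a measured value for this particular PSU):

```python
# Splitting the ~150 W wall-socket figure between the card and the PSU,
# assuming an 80 Plus Gold supply at 91% efficiency under moderate load.
wall_w = 150.0
psu_efficiency = 0.91

card_w = wall_w * psu_efficiency   # consumed by the card itself
psu_loss_w = wall_w - card_w       # dissipated as heat in the PSU
print(card_w, psu_loss_w)  # -> 136.5 13.5 (the post rounds to 137 W / 13 W)
```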

MrS

Scanning for our furry friends since Jan 2002
