My 5700 plus 570 configuration ran successfully for almost a day at original settings, so I'm currently trying out a little power limitation on the 5700 and a lot of power limitation on the 570. Also, I finally thought to move my DVD player off the top of the case, where it was completely blocking one of the two top-of-case fans.
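As an aside, for anyone wanting to apply the same sort of limit on a Linux box with the amdgpu driver: a percentage-style power cap can be set through the hwmon sysfs files. This is only a rough Python sketch -- the paths are assumed, it needs root, and it is not necessarily how the limits above were applied.

import glob

def set_power_limit(card_index, percent_reduction):
    # amdgpu exposes its power cap in microwatts under the card's hwmon directory
    pattern = f"/sys/class/drm/card{card_index}/device/hwmon/hwmon*/power1_cap"
    caps = glob.glob(pattern)
    if not caps:
        raise RuntimeError(f"no power cap file found for card{card_index}")
    cap_path = caps[0]
    with open(cap_path + "_max") as f:       # highest cap the driver allows
        max_uw = int(f.read())
    new_uw = int(max_uw * (1.0 - percent_reduction / 100.0))
    with open(cap_path, "w") as f:           # writing requires root
        f.write(str(new_uw))
    print(f"card{card_index}: cap now {new_uw / 1e6:.0f} W "
          f"({percent_reduction}% below {max_uw / 1e6:.0f} W)")

set_power_limit(0, 35)   # e.g. the lower card at -35%
set_power_limit(1, 40)   # and the upper card at -40%

Note that Wattman-style percentage limits are usually taken relative to the card's default board power, so the absolute watt figures may not line up exactly with a cap computed from power1_cap_max.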
The current condition summary: a nominal GRP daily credit rate of 1.35M and box-level power consumption of 318 watts. The 570 reports a GPU temperature of 70C and the 5700 reports 63C. Fan noise is reasonable, with the 5700's fan turning at 42% and the 570's at 49%.
I've thought more and done a little investigation along the lines of "why are things so much happier with the 570 above the 5700 than below it?"
I've noticed that these cards push far more of their exhaust air into the interior of the PC case, relative to what they dump out the I/O panel, than I had imagined. The 5700 appears to send its flow perpendicular to the motherboard plane, but the 570 directs a lot of flow sideways--right toward the top of the slot gap from which the 5700 gulps its intake air when I run the configuration with the 5700 above.
So I think a double 5700 system using this particular model of card is not as unthinkable as I thought a day ago. First, this card appears to have a much less unfavorable in-case exhaust direction than my 570. Second, I can throttle back power consumption quite a bit if need be. Third, I have side fans on my case that I could crank up a bit (and they appear to have dirty filters I can clean to get more flow).
A more fanciful idea is to try intervening in the local airflow with some sort of dam. At the moment, I imagine I could tape index cards onto the "lower" 5700 to fence its exhaust away from the upper card's intake a bit. Just getting more mixing with the case interior air would help a good deal. The motherboard sensor has not yet gone above 40C, so the box is not running horribly hot.
Amazon's mind reader caught me looking again at the same model of XFX 5700, and correctly guessed that if they lowered my price to $300 they could sell me another one. Your price may differ.
I figure I'll try to get a dual 5700 setup to work in this box, and if I fail, I'll pull a pair of 570s out of my "second best box" and sub in the newest 5700. I'll lose a little in credit rate but save enough electric power to be a consolation prize.
I wonder whether Navi 10 crunching performance on gravitational wave tasks would be better if it were run with an X570 chipset and a Ryzen (or Threadripper) 3000-series CPU to take advantage of PCIe 4.0 bandwidth. Or would any meaningful hardware-based GW performance gain come mainly from a faster CPU, regardless of PCIe bandwidth? Any thoughts?
Ideas are not fixed, nor should they be; we live in model-dependent reality.
I’m not sure if the AMD app is any different, but on my Nvidia cards, the only noticeable speed improvement came from adding a faster CPU.
PCIe bandwidth use is really low, and not the source of the bottleneck. All things being equal, PCIe 3.0 vs 4.0 will likely make no difference. You could even cut it down to PCIe 2.0 or 1.0 with no impact.
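One way to sanity-check that, assuming a Linux host with the standard PCI sysfs layout, is to read back the link the cards actually negotiated, for example:

import glob, os

def read(path):
    with open(path) as f:
        return f.read().strip()

for dev in glob.glob("/sys/bus/pci/devices/*"):
    try:
        if not read(os.path.join(dev, "class")).startswith("0x03"):
            continue                              # display controllers only
        if read(os.path.join(dev, "vendor")) != "0x1002":
            continue                              # AMD/ATI vendor ID
        speed = read(os.path.join(dev, "current_link_speed"))
        width = read(os.path.join(dev, "current_link_width"))
        max_speed = read(os.path.join(dev, "max_link_speed"))
        max_width = read(os.path.join(dev, "max_link_width"))
        print(f"{os.path.basename(dev)}: x{width} at {speed} "
              f"(maximum x{max_width} at {max_speed})")
    except FileNotFoundError:
        continue

If task times don't change when a card is negotiated down to an older or narrower link, the bus clearly isn't the limiter.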
My 5700 box has been running two RX 5700 cards for the last few hours. Both are the same model, XFX Rx-57XL8LBD6. Mindful of the severe trouble I had when I stationed one of these cards above an XFX 570 card in the same box, I proceeded with great thermal caution: I vacuumed the filters, turned up the case fan speeds, and set an initial power limit of -50% on both cards.
As time has gone by, I've felt confident enough to relax the power limits progressively. Currently the lower card is at a -35% power limit and the upper (hotter) card is at -40%. At this combination, the nominal Einstein GRP daily credit rate is over 1.7M, which is slightly higher than I ever got my Radeon VII to give in the same box.
At -50/-50, box power consumption at the wall was about 308 watts, and it does not look like it will go much over 400 watts. Right now the power meter reads about 360 watts.
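As a rough efficiency comparison, using only the figures quoted in this thread (nominal daily credit and power at the wall, treated as steady-state):

configs = {
    "RX 5700 + RX 570 (power limited)": (1.35e6, 318),   # credits/day, watts
    "2x RX 5700 at -35%/-40%":          (1.70e6, 360),
}
for name, (credits_per_day, watts) in configs.items():
    kwh_per_day = watts * 24 / 1000
    print(f"{name}: {credits_per_day / kwh_per_day:,.0f} credits/kWh "
          f"({kwh_per_day:.1f} kWh per day)")

That works out to roughly 177k credits per kWh for the limited 5700 plus 570 pairing and roughly 197k credits per kWh for the dual 5700s at the current limits.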
So a double 5700 configuration at decent productivity is definitely possible in this box. From a shared space thermal point of view, I think this particular card design is far from ideal, though I don't know how it compares in that respect to other available RX 5700 cards.
I have a long history of running two cards in a box, starting in the Nvidia Maxwell generation, when I noticed that the Nvidia price/performance curve gave much more output per dollar in the midrange than at the top end, and that pretty ordinary motherboards and case configurations would support two midrange cards. I think the particular Nvidia cards I ran two to a box over the years all dumped a much higher fraction of their heat out the I/O panel than this particular XFX 5700 design does, and also more than the preceding XFX 570 design I've been using.
I mentioned that this 5700 has a major airflow opening atop the card. Today, with the card in my hand ready to install, I noticed it also has a similarly sized and shaped opening on the bottom. So if the general airflow pattern in your case runs from bottom to top, this 5700 is dumping hot air right where it will flow into the intake of the next card up.
Before the second 5700 card arrived, I had a first try at redirecting airflow using a couple of index cards taped to a 5700 positioned below a 570. That attempt was a failure: the 570 ran 3C hotter with it in place than before. I have little doubt that it is possible to rearrange the airflow to improve matters, but I don't have a currently viable concept other than the obvious one of substituting higher-flow case fans for the ones installed now. My case has a lot of fans (two front, one bottom, two left side, one back, two top), but they are mostly somewhat slow and quiet, and I'd like to keep them that way. If you like to run an open case with a big industrial fan helping out, all this is probably not a problem at all. But it caught me by surprise, so I thought I'd warn the reader a little.
Has anyone been able to get a 5700 working on a Linux host? The latest version of the Linux drivers from AMD is 19.50 and the Windows driver is on version 20.2.1. I think it was the change from 19.x to 20.x that fixed some of the driver issues for Einstein.
I don't recall seeing anyone reporting success running under Linux. I tend to stay away from new hardware until the initial bugs/deficiencies/excessive costs get sorted out :-).
Do you have one that you can't get going, or are you just thinking about one? :-) From occasional browsing over at Phoronix, I get the impression that things are improving, but you'd probably need to be close to the bleeding edge to get a workable system.
I don't think AMD has released an updated Linux driver. They released drivers with the fixes for Windows, but maybe not enough people run Linux with these cards, so they never got reports that the Linux drivers were broken too.
I think you need to be on the latest 5.4 kernel, paired with the latest mesa-opencl driver. I saw this update cross a machine where I have the Mesa drivers installed alongside the Nvidia drivers:
mesa-opencl-icd:amd64 (19.2.1-1ubuntu1~18.04.1)
But that is just with the current 5.3.0-28-generic kernel. I think you need the 20.1 drivers for the OpenCL to work properly on Navi.
I am interested in purchasing one, but I am going to wait until a 20.x Linux driver is released.
I think someone needs to ask directly in the Phoronix forums whether the current shipping Mesa 19.2.1 driver in the Ubuntu distro works for OpenCL. They have never revisited OpenCL performance on the current shipping drivers since the Navi cards were released.
They have only had articles on the beta drivers shipping with the upstream 5.4 kernel, and they have only tested gaming performance.
The question that still needs answering is whether OpenCL is broken on the drivers that are available without going upstream.
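One way to answer that locally, assuming the pyopencl package is installed on the machine in question, is simply to list what the installed driver stack exposes; if the Navi card does not show up here, the Einstein OpenCL app will not see it either:

import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name} ({platform.version})")
    for device in platform.get_devices():
        print(f"  Device : {device.name}")
        print(f"  Driver : {device.driver_version}")
        print(f"  OpenCL : {device.version}")
        print(f"  CUs    : {device.max_compute_units}")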