I'd like to ramp up my crunching capacity and could use some guidance. I posted a similar question on the BOINC subreddit; setting up a GPU machine was brought to my attention, and it was suggested that I cross-post in this forum.
I've rolled my own machines in the long, long past, like nearly two decades ago. I'm not afraid to do it but I'm certainly out of touch with the latest gear.
Any build guides around on getting started? The only caveat is I'd like to keep it Linux. And hopefully not break the bank.
Here's my other post: https://www.reddit.com/r/BOINC/comments/3sdgjn/question_whats_the_best_bang_for_the_buck/
Looking to get started with GPU processing
Welcome John
You've saved a licence fee, so that's one less withdrawal from the bank! You may be taxed in "other ways".
You might look at the BOINC GPU forum and the GPUGrid forums.
I had good intentions about writing up my hardware experiences; I took some photos along the way and intended to write it up, but I did do a short summary of Ubuntu / AMD 7990 in the BOINC GPU forum.
Electricity costs will surpass the purchase cost, so I'm not sure whether you hadn't thought of them as significant or electricity is free for you!
Some useful stuff here: BOINC GPU computing.
Crunching with large or multiple GPU cards uses two or three times the power of the rest of the system, so it generates heat and fan noise; a high-quality PSU will pay for itself in power saved over the life of the system.
To give you an example, my AMD 7990 GPU runs at 86-89 C even with 5 case fans and generates about 490 W of heat. Cooling the room also becomes a problem.
You say you want to spend a couple of hundred per month; does that mean you want to spend, say, $3600 on hardware for one cruncher to last 3 years, or something else? Perhaps you might scroll through the stats page and get an idea of what hardware fits your expectations. In general, older is cheaper to buy but slower and more power-hungry to run.
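To put rough numbers on that, here is a minimal sketch in Python of purchase cost versus running cost. Every figure in it (hardware price, wattage, electricity tariff) is an assumption for illustration; your own numbers will differ:

```python
# Rough total-cost-of-ownership sketch for a 24/7 cruncher.
# Every figure below is an illustrative assumption, not a measurement.
hardware_cost = 1200.0   # USD, one-off purchase (assumed)
system_watts = 500.0     # average draw under load (assumed)
price_per_kwh = 0.15     # USD per kWh (varies widely by region)
years = 3

hours = years * 365 * 24
energy_kwh = system_watts / 1000 * hours
electricity_cost = energy_kwh * price_per_kwh

print(f"Electricity over {years} years: ${electricity_cost:,.0f}")
print(f"Hardware (one-off): ${hardware_cost:,.0f}")
print(f"Total cost of ownership: ${hardware_cost + electricity_cost:,.0f}")
```

With those made-up numbers the electricity comes to about $1,970 over three years, more than the hardware itself, which is exactly the point.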
I would not invest a lot in laptop GPU crunching, unless mobility is needed.
AMD's latest GPUs have a driver(?) issue which prevents them from running at their maximum throughput, which is why their numbers are scarcer in the top 100.
This thread discusses single GPU choices.
Anyway, good luck. eBay has plenty of gamers who want to upgrade their systems, and that's a rich source of low-cost equipment.
AgentB, Excellent! Thank
AgentB,
Excellent! Thank you for the tips. Sounds like you're the person I need to talk to regarding Linux. In another thread someone mentioned that AMD is better for E@H, but your AMD + Ubuntu build was not easy (plus the latest driver issue). I think I'd like my first GPU machine build to be a pleasant experience; that's more important than getting the best performance. Can you recommend a GPU card and CPU/motherboard combo that should play well with Ubuntu?
Thanks,
John
RE: AgentB, Excellent!
Well this forum has some very knowledgeable people who visit, so i have learnt a lot here.
It's not that simple; over time, applications and hardware change and it swings one way then the other. Bugs get fixed, and my specific troubles may no longer exist. Currently it seems more Green than Red in my opinion. I will be replacing my trusty, reliable GTX 460s at some point in the next 12 months; I'll probably replace them with NVIDIA if the status remains the same.
No, you haven't said what your budget is or what you want to build. I have certain preferences for hardware (and for what I buy new or used), but that has nothing to do with Ubuntu, if that makes sense. Almost all Linux distros run well on modern hardware; the current AMD driver issue is a bit of an exception, so I would probably avoid the latest AMD hardware for the moment, which is a shame (I secretly crave an AMD Gemini, but the driver issue would halve its performance).
I just took delivery today of an Intel Celeron 1037U fanless 17-watt mini-computer. I would not recommend that! I'm not sure I should even crunch on it at all.
RE: I'd like to ramp up my
Hi John,
There are quite a few people around here, like AgentB, who are knowledgeable about the subject and generous with their time. You should have no problem getting enough information to guide you on your journey. The important thing is to do the planning before the purchasing. Avoid the trap of rushing in before you really understand all the possible pitfalls.
To help those willing to provide suitable recommendations, here are some questions for you to think about and respond to.
* Do you intend to focus on GPU crunching only or do you want some sort of optimum mix of CPU and GPU crunching?
* Do you really not have to worry about the cost of power or ultimately will it have to come out of your pocket in some way?
* A fair part of the capital cost of a fully functional computer is in peripherals not needed for crunching. Would you be happy to consider stripped down machines with just the essential components, perhaps in recycled cases?
* Do you have a lifetime in mind over which the equipment should be expected to function productively without undue risk of premature failure?
* Are you averse to having your 'production target' come from a larger number of individually smaller (but ultimately lower cost) machines?
* Where are you going to house your 'farm'? Have you considered the heat production and how you are going to deal with it?
* Do you have a total budget in mind for when your 'farm' is complete?
The above are for starters only. There will probably be more as things progress :-).
Your decision to use Linux is well founded. It's very simple to set up and easy to troubleshoot and maintain for someone with just basic CLI skills. I have a lot of GPUs, both AMD and NVIDIA and all running on Linux. These days I have no trouble installing/updating drivers for either type of card.
For a while I had felt that AMD were better performers (output per watt), but that may have changed in more recent times with the low power requirements of Maxwell2 GPUs. My problem with the latest NVIDIA cards is that they are quite a bit more expensive here than corresponding AMD cards so I haven't (yet) been tempted into a more thorough investigation. That may be quite different in your country where you should have access to better pricing.
Around 3 years ago, a user named Robert posted this message that contained a graph comparing various GPUs by price and performance. The graph is no longer available by the look of it, but I remember it well. The Y axis was a performance metric, perhaps credits/day, and the X axis was cost.
Two prominent GPUs shown were an HD 7970 and an HD 7870. There was a straight line through the origin to the 7970 point, and the 7870 lay very close to this line too. The line was described as something like the "maximum performance" line. There were several other (mainly NVIDIA, I think) GPUs listed, and most of these were well to the right of the line, particularly any higher-end cards. In other words, for a given performance level, the NVIDIA card had a higher price. Correspondingly, for a given price level, the NVIDIA card had a lower performance. In those days, the 7970 was king. Interestingly, the 7870 (and also the 7850) were pretty much on that best-performance line.
One of my GTX 650s was also on that graph, and it wasn't too far from the line, so I was quite pleased to see that at the time. Any higher-end NVIDIA cards were well below and to the right of the line. The clear message was that, apart from the notable exception of the 7970, the high-end cards were not good performers in the best-bang-for-your-buck stakes.
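For what it's worth, that "line through the origin" is just performance per dollar: a card's slope from the origin is its credits/day divided by its price, so ranking candidates takes only a few lines of code. A minimal sketch, where the card names and figures are invented placeholders rather than the actual data from Robert's graph:

```python
# Rank GPUs by credits/day per dollar, i.e. the slope of the line from
# the origin on a performance-vs-cost graph. Steeper slope = better value.
# All numbers are invented placeholders for illustration.
cards = {
    "card A (high end)":  {"credits_per_day": 110_000, "price_usd": 650},
    "card B (mid range)": {"credits_per_day":  60_000, "price_usd": 220},
    "card C (mid range)": {"credits_per_day":  90_000, "price_usd": 450},
}

ranked = sorted(cards.items(),
                key=lambda kv: kv[1]["credits_per_day"] / kv[1]["price_usd"],
                reverse=True)

for name, c in ranked:
    slope = c["credits_per_day"] / c["price_usd"]
    print(f"{name}: {slope:.0f} credits/day per dollar")
```

Cards on or near the steepest slope give the best value; cards well to the right of that line buy less output per dollar, which was the story for most of the high-end cards on the graph.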
A lot of things have changed over the last three years so you really need to assess things afresh now. At one point, AMD GPUs benefited from driver improvements. The GPU apps themselves have improved considerably in a couple of stages as well. More recently, beta test versions of apps which use CUDA 5.5 libs have been found to give around a 20% further improvement for Kepler and Maxwell series GPUs. Maxwell2 have lower power requirements. It is quite a different ball game now so it's important to research thoroughly what you intend to purchase. It's a very good idea to try to find ex-gaming GPUs going cheaply on ebay because the owners must have the latest and greatest ;-).
I started crunching at Seti over 16 years ago in the 'classic' days well before BOINC. I moved to Einstein when it first opened to the public. Whilst the many pulsar discoveries made here have justified my faith and interest in this project, the real excitement is soon to happen when the advanced LIGO data (for gravity wave detection) is made available to the volunteers. This makes right now a very good time to be getting seriously involved with this project. I'm guessing there will likely be a GPU app for crunching future GW tasks. So far, all GW crunching has been CPU based. The GPU app might not happen right away so you may need some decent CPU performance as well to start with :-).
Cheers,
Gary.
RE: No, you haven't said
Well, just based on what I've learned in the last couple of days, I think what I'm looking for is a barebones Intel quad-core with a case and motherboard ready for expansion. I'll probably start off with one card, but I may get addicted, so: something with multiple GPU card slots, a swappable PSU, etc. If I can do that, without the GPU, for around $500, I would be pleased. I'd prefer not to go used.
Any suggestions appreciated. And I've appreciated everything you've said already.
RE: The important thing is
Wow, Gary, thanks for the great, insightful post and warm welcome. I'm certainly not wanting to rush and make a costly mistake. I don't want to wear out my welcome either :)
My thoughts have changed a little since my original (reddit) post. Originally I was thinking "farm" just because I like the idea of maintaining a network of servers for fun. But once the fellow there mentioned a GPU card blazes out 100k credits per day, that changed everything. I've been off and on here since 2010 and only churned out 300k! Based on what I've learned since then, such as the heat generation, a farm isn't what I have in mind for this project. And presently, that many credits a day is mind-blowing to me, so one simple, barebones machine is probably a great start and introduction to the cost and heat issues while still amazing me.
Great questions...
Crunching only.
I don't have any requirements on this, but I'm not knowledgeable here. I do want to participate in LIGO when that happens.
I pay for electricity. I'm ready to accept the cost of 500 W running 24/7.
Yes, I'd be very happy with a stripped down machine. My only concern is that my first build be pleasant. I'm afraid to buy a used gamer rig on my first try only to find out it was abused or I got scammed.
No lifetime in mind. But modular is better. I had a Dell PowerEdge SC420 that I maintained myself. I kept that beast humming for well over a decade, replacing components periodically.
I don't think so.
It's winter here and I could benefit from some warmth. It's a hot climate during the summer, though; I'm guessing I could only run at night then.
This question is kind of obsolete. Not really thinking 'farm' anymore. A single machine to introduce me to realities I think will work for now.
I look forward to more things! Thank you so much for your involved and detailed response. This is fantastic.
John
RE: RE: No, you haven't
I am in the US and have a local "Microcenter" store near me where I can get a 6-core AMD CPU, motherboard, and 8 GB of memory for around $200 to $250. Add in a power supply of, say, 650 watts or better; I buy 750 or 850 W ones, but they are closer to $100 each for the Bronze ones. The higher-rated power supplies put out less heat. If you can reuse anything it will help; otherwise Newegg sometimes has decent large cases on sale for under $35 with free shipping. That puts you around $400 or so with no hard drive, but if you are JUST crunching, a used one will work just fine. Just set up a backup system and any crashes are easily recovered from. There are several free backup tools you can use; Acronis even has a free one for most major hard drive brands, but it only works for one brand at a time.
To be honest, most projects are faster if you use an Intel-based CPU, but the AMD ones are cheaper in most cases and will last just as long.
Of course there are several
Of course there are several ways someone can go about "building" a system.
You could wait until Cyber Monday and get some deals online from computer stores on the components you want.
NewEgg sometimes has bundle deals where you can get several items cheaper than you could individually. (I tend to use this method)
Then there is this site which will search the web for the best prices on individual components
https://pcpartpicker.com/
The whole Intel vs AMD chips question is pretty moot if all you are going to do is crunch on the GPUs.
However, you said you wanted to do some gravitational waves on the CPU when they become available again. I have a mix of both types and found the Intel chips are much better suited to analyzing the Gravitational Waves (up to 100% faster in some cases) than the AMD chips.
So if you are still wanting to do that then I would go with an Intel.
Looking over your machines, you have at least one 4-core machine. Have you thought about just throwing a half-size 750 Ti in there? I used to have a Dell, and it had 1 PCIe slot. PSUs in Dells tend to be around 350 W, just barely enough to support that GPU, as it doesn't require much power. But it ran 24/7 for more than 2 years until I donated it to a family member.
The 750 Ti doesn't require a 6- or 8-pin connector, as all its power comes from the PCIe slot.
Good luck
Zalster
John, I've started to
John,
I've started to respond several times and stopped because there is just so much to consider that it is really hard to "just recommend something."
If you will look at my computers (which are not hidden) you will see that I have a "farm." Some are at one office, some at another, some at home, and some in another location, but they are all mine, put together by me (or at least with a GPU added in a couple of cases), and maintained by me.
My observation is: Building machines incrementally is the most expensive and frustrating way to do it. Let me give you some examples of why.
First you say, "This 550 W power supply is $30 cheaper than this 850 W power supply," and you buy it. Then you say, "I'd really like to add a card to that empty PCIe slot," and you have to go back to the store and buy the 750 W unit you find on sale. Then you notice that there is another PCIe slot, but you don't have another set of PCIe power plugs, so now you go back to the store and buy the 1,000 W power supply that does have enough power. Then you put the third card in the machine and realize you can't keep the cards cool.
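The underlying lesson is to size the PSU once, for the machine you intend to end up with. A minimal sketch of the arithmetic, where the component wattages and the 60% load target are planning assumptions rather than measurements:

```python
# Size the PSU for the *final* build, not the first one.
# Component draws below are rough planning assumptions.
cpu_watts = 95
per_gpu_watts = 150
planned_gpus = 3          # how many cards you expect to end up with
board_drives_fans_watts = 60

peak_draw = cpu_watts + planned_gpus * per_gpu_watts + board_drives_fans_watts

# Running a PSU at roughly 50-60% of its rating keeps it near its
# efficiency sweet spot and leaves headroom for load spikes.
recommended_rating = peak_draw / 0.6

print(f"Estimated peak draw: {peak_draw} W")
print(f"Suggested PSU rating: ~{recommended_rating:.0f} W")
```

With those assumptions you land near a 1,000 W unit on day one, instead of paying for a 550 W, then a 750 W, then the 1,000 W anyway.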
Now we start the same process with cases and fans... First you buy a marginally more expensive and larger case, then you get a bigger one, then a bigger one, and you add fans. Alternatively, you might go for a "hybrid" cooling solution you have to buy and add to each card separately, only to find that because of hoses you don't have as much room as you thought you did.
Oh, and you discover that the "x" card that produces 60,000 credits per day cost you 70% of what the "y" card that produces 110,000 credits per day would have cost when you "settled and compromised," but now "y" is on sale for $10 more than you originally paid for "x."
...and gosh... a little overclocking... if only you had that mammoth CPU cooler, which, once you purchase it, you discover hits the RAM and won't let your top GPU sit in its slot...
I'm not suggesting that you will really go through every one of those steps, but I'm just trying to illustrate that a build that "creeps" toward your goal is ultimately more expensive than setting your sights high and slowly building that monster-cruncher you really wanted in the first place.
As a second point, it is also easy to "aim low" and end up with that leftover 500 W power supply, a leftover motherboard that only has one PCIe slot, a case and fans, and now you "might as well" spend the money on a cheap CPU and RAM to go into it and put your less-able GPU in there, because that is the least expensive route to having another machine to crunch with.
Before long you are surrounded by semi-okay crunchers, with nowhere to put them, and you spend far more time keeping them all running and managing them than you meant to.
If I were re-starting crunching, I would appreciate someone telling me a few things.
Look at the Top Hosts list in the Statistics pages, and keep moving down the list until you are happy with the cost to build one (some are not that expensive to build), and set your sights on a similar build.
Everyone has a prejudice. Some people will only use "z" cases. Some people will only buy "w" GPUs. Some people are a little insecure and have to believe that what they have is "the only way to go," because they need to reassure themselves. There are dozens of ways of getting to the same place, so take all of that with a grain of salt, if it happens.
If you look at my machines you'll see almost exclusively AMD CPUs and some of them are very old. That's NOT because I think they are superior (the Intels are better), but because they are cheap. They have to be (for me) because I bought so many and couldn't afford a bunch of "good" CPUs. Also, the motherboards were cheap. It was simply the cheapest way to drive video cards.
...and that's almost exclusively what I do because the video cards are so much more productive than the CPUs.
So MY reasoning goes like this: If I spend money on a really nice CPU/motherboard combination, that's less money that I have to spend on GPUs which are going to out-produce the CPU 10-20:1. Seems like a bad trade-off to me; I'd rather have more or better GPUs. So, I bought cheap CPU/motherboard combinations.
OTHER people's reasoning might be that they are about to spend huge dollars on GPUs and don't want any excuse for the GPU not to perform at the very top of its game, so every piece of support hardware needs to be the best, fastest, most up-to-date hardware they can find.
I can see either train-of-thought as legitimate. I have to opt for cheap.
The whole GPU choice issue is related, but different.
If you had looked at the top hosts a year ago you would have been hard-pressed to find a NVIDIA-based machine in the top 40. Now that list is dominated by NVIDIA cards. That's not because the NVIDIA cards got better, but because pieces of the NVIDIA programs we run got much better. Right now, as you have already read, I'm sure, the top AMD cards are being hamstrung by some combination of code/architecture/driver issues. Once those get ironed-out (if they do), there will probably be a mixture of AMD/NVIDIA cards in the Top 50 machines.
But whether you choose to go AMD or NVIDIA, you should at least have a vague "plan."
My GUESS, and it is only a guess, and I'm only trying to be "conservative" in my guessing, is that you are going to very quickly realize the limitations of a bare-bones system and will wish you had done something else and not spent the money on a half-measure.
As to individual component selection... I hesitate to get involved. As I said earlier, there are a dozen ways of getting where you want to be.
But before you pull the wallet out of your pocket, I would look at the Top Host list and decide where you want to go. You don't have to copy one of them, but if you are comfortable with (hypothetically) three NVIDIA 970 cards in your system, then you need to be thinking FROM THE VERY START about cooling them, getting the right motherboard to support them, how much CPU you really need to feed them, what kind of power supply you'll need (even next year), etc.
That will keep you from buying three power supplies at a greater total cost than buying the right one to begin with. The "ABC" case on sale may look great now, but will it be large enough to house all three cards when you get them next year?
So... a couple of concepts:
The very top-of-the-line GPUs are the best, but not necessarily at the top of the price/performance curve. A couple of steps down usually gives the better cost per unit of output. Often, two of those will out-produce one of the very best cards for less cost, BUT... they do take up two slots instead of one. So now you have to ask whether the money saved is more valuable, or if the extra PCIe slot is more valuable so you don't have to build another computer to run another card. You had said you were comfortable pulling 500 W, so I can tell you that two cards are probably your better choice. You're going to be able to get to 500 W in a single system pretty easily.
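A quick sketch of that two-cards-versus-one trade-off under a 500 W ceiling; all the card figures here are invented for illustration, not benchmarks:

```python
# Compare one top-end card against two mid-range cards under a power cap.
# Card figures are illustrative assumptions, not benchmarks.
POWER_CAP_W = 500
BASE_SYSTEM_W = 150   # CPU, board, drives, fans (assumed)

options = {
    "1 x top-end":   {"credits_per_day": 110_000, "gpu_watts": 250,
                      "price_usd": 650, "slots": 1},
    "2 x mid-range": {"credits_per_day": 2 * 60_000, "gpu_watts": 2 * 150,
                      "price_usd": 2 * 220, "slots": 2},
}

for name, o in options.items():
    total_w = BASE_SYSTEM_W + o["gpu_watts"]
    verdict = "fits" if total_w <= POWER_CAP_W else "exceeds cap"
    print(f"{name}: {o['credits_per_day']:,} credits/day, "
          f"{total_w} W total ({verdict}), ${o['price_usd']}, "
          f"{o['slots']} slot(s)")
```

With these made-up numbers, the pair of mid-range cards wins on both output and price at the cost of an extra slot, which is exactly the compromise described above.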
The LEAST important thing is the speed of the hard drive. The next (in my experience) is the speed of the RAM. The next would be the speed of the CPU (within reason). The next would be the case. The next would be the power supply. THEN the GPU(s) is most important.
The case ranks so high because you *MUST PAY ATTENTION* to heat dissipation. If the case won't move the heat, it really doesn't matter what else you buy, it won't work.
That's my long, boring story, and I'm sticking to it.
We can help you narrow choices, but we need to know what compromises you are making in order to guide you.
I would encourage you NOT to spend ANY money on anything to start with which is going to build a brick wall in front of your intentions. My fear is that a cheap bare-bones may do that.
Please go look at the Top Host list. I think you may be surprised by what you find.
Thanks,
Thanks, Zalster.
Good call! Hopefully I'll have a plan ready by then.
It's a laptop.
Good to know this is an option with a general-purpose machine, though; if I come across a half-size 750 Ti, I'll have a use for it.
I've learned the most awesome things in the last 24 hours!