Providing WUs with downsized GPU memory usage!

WulfKnight
Joined: 10 Mar 10
Posts: 2
Credit: 159721
RAC: 0
Topic 194918

Hi there,

I'm concerned about how much GPU memory is needed to run Einstein WUs on CUDA-enabled systems.

I'd suggest providing downsized WUs for computers that don't have as much GPU RAM, for example 192 or 224 MB instead of 400 to 450 MB.

That way nearly all CUDA-compatible computers could contribute their maximum computing power to this project.

Thanks for reading, and hopefully considering this,

WulfKnight
Germany

Bernd Machenschalk
Moderator
Administrator
Joined: 15 Oct 04
Posts: 4312
Credit: 250440722
RAC: 35187

Providing WUs with downsized GPU memory usage!

Sorry. We are aware that our requirements exclude some GPUs out there, but we can't work around the requirements of a CUFFT 'plan' for our FFT size.
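As an illustration only (not the project's actual code), the sketch below measures how much device memory a single CUFFT plan of a given length consumes by comparing free GPU memory before and after cufftPlan1d(). The transform length of 2^22 points is purely hypothetical, since the real FFT size of the Einstein@Home CUDA app is not stated in this thread; the plan's scratch space comes on top of the data buffers the application itself has to keep on the card. Build with something like "nvcc plan_mem.cu -lcufft".

/* Minimal sketch: report the device memory taken by one CUFFT plan.
 * The transform length below is illustrative, not the project's value. */
#include <stdio.h>
#include <cuda_runtime.h>
#include <cufft.h>

int main(void)
{
    const int nx = 1 << 22;          /* hypothetical transform length */
    size_t free_before, free_after, total;
    cufftHandle plan;

    cudaMemGetInfo(&free_before, &total);   /* free GPU memory before the plan */

    if (cufftPlan1d(&plan, nx, CUFFT_C2C, 1) != CUFFT_SUCCESS) {
        fprintf(stderr, "cufftPlan1d failed (not enough GPU memory?)\n");
        return 1;
    }

    cudaMemGetInfo(&free_after, &total);    /* free GPU memory after the plan */

    printf("Plan for %d-point C2C FFT uses about %zu MB of device memory\n",
           nx, (free_before - free_after) / (1024 * 1024));

    cufftDestroy(plan);
    return 0;
}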

BM

WulfKnight
Joined: 10 Mar 10
Posts: 2
Credit: 159721
RAC: 0

RE: Sorry. We are aware

Message 98001 in response to message 98000

Quote:

Sorry. We are aware that our requirements exclude some GPUs out there, but we can't work around the requirements of a CUFFT 'plan' for our FFT size.

BM

I understand your point, but unfortunately not many participants own a big, fat graphics card with gigabytes of RAM.

You should think about the thousands, even millions, of computers equipped with less than 512 MB of graphics RAM: not only mobile devices, but many desktop computers, too.

The BOINC platform was designed to reach a wide range of computers and provide their unused computing power to help you reach your goals.

So it's up to you to provide apps that can be used by a large number of computers and their users.

It began with CPU crunching; now the time for GPUs has begun.

Write usable apps for us, and we'll give you our computing power.

Thanks,

WulfKnight
Germany
