Hi there,
I'm concerned about how much GPU memory is needed to run Einstein WUs on CUDA-enabled systems.
I'd suggest providing downsized WUs for computers that don't have as much GPU RAM, for example 192 or 224 MB instead of 400 to 450 MB.
That way, nearly all CUDA-compatible computers could contribute their maximum computing power to this project.
Thanks for reading,
hopefully
WulfKnight
Germany
Copyright © 2024 Einstein@Home. All rights reserved.
Providing WUs with downsized GPU memory usage!
Sorry. We are aware that our requirements exclude some GPUs out there, but we can't work around the memory requirements of a CUFFT 'plan' for our FFT size.
BM
RE: Sorry. We are aware
I can understand your point, but unfortunately not many participants own a big, fat graphics card with tons of gigabytes of RAM.
You should think about the thousands, even millions, of computers equipped with less than 512 MB of graphics RAM. Not only mobile devices, but many desktop computers, too.
The BOINC platform was designed to reach a wide range of computers, so that their unused computing power can help you reach your goals.
So it's up to you to provide apps that can be used by a large number of computers and their users.
It began with CPU crunching; now the time for GPU crunching has begun.
Write usable apps for us, and we'll give you our computing power.
Thanks,
WulfKnight
Germany