I tried uninstalling all the NVIDIA GPU, CUDA, and Einstein@Home/BOINC files... reinstalled everything and rebooted.
Still running on CPU.
The client says:
CUDA device found
Coprocessor: 9800GTX/9800GTX+ (1)
No general preferences found - using BOINC defaults.
Any help? I have an overclocked GPU; it would score a lot of points running 24/7.
Thanks!
AKA Warlord420
Can't get it to use GPU.
As of now, E@H does not provide a GPU application; only SETI and GPUGRID do.
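If you want to double-check that the card itself is visible to CUDA (independently of BOINC), a tiny standalone device query along the lines of the sketch below will list it. This is only an illustrative program compiled with nvcc, not anything that E@H or BOINC ships:

// devquery.cu - illustrative standalone check, not part of BOINC or E@H.
// Build with: nvcc devquery.cu -o devquery
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        printf("CUDA error: %s\n", cudaGetErrorString(err));
        return 1;
    }
    printf("CUDA devices found: %d\n", count);
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("  %d: %s, %lu MB memory, compute capability %d.%d\n",
               i, prop.name,
               (unsigned long)(prop.totalGlobalMem / (1024 * 1024)),
               prop.major, prop.minor);
    }
    return 0;
}

If that lists your 9800 GTX, the card and driver are fine; the missing piece is simply that this project has no CUDA application to send you.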
Michael
Team Linux Users Everywhere
RE: As of now, E@H does not
Yeah - they do not, but there is some misleading PR out there about Einstein using GPU technology. Einstein is mentioned here and also here, and at first glance it appears that the CUDA stuff is available.
The link (above) supplied by NVIDIA does not lead to any Einstein GPU or CUDA info.
It would be nice if all these projects started using modern tech, especially that climate prediction project. If they got away from Fortran and switched to the latest technology, maybe their computations would speed up and they would discover we are in an ice age instead of global warming.
Sorry for my rant.
In SETI I crunched a WU in
In SETI I crunched a WU in 10,000 s. Another wingman did it in 14,000 s. Neither of us used a graphics board and CUDA. A third wingman, using a quad processor and 4 graphics boards of the latest generation, crunched it in 84 s. But his result was judged invalid and he got zero credits. So, if climateprediction.net started using GPUs, maybe we would justly end up in an ice age.
Tullio
RE: ...crunched it in 84
He certainly didn't crunch it in 84 s. His CPU needed 84 s to feed the GPU with data. The time needed to crunch a job isn't recorded with SETI CUDA (AFAIK :-)
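To illustrate the point with a toy example (this is just a minimal sketch, not SETI's actual code): a kernel launch returns to the CPU almost immediately, so a host-side timer wrapped around the launch records next to nothing, while CUDA events report the time the GPU really spent:

// timing.cu - illustrative only; not SETI's application.
#include <cstdio>
#include <ctime>
#include <cuda_runtime.h>

__global__ void busywork(float *x, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        for (int k = 0; k < 20000; ++k)   // artificial load
            x[i] = x[i] * 1.000001f + 0.5f;
}

int main() {
    const int n = 1 << 20;
    float *d;
    cudaMalloc(&d, n * sizeof(float));
    cudaMemset(d, 0, n * sizeof(float));

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    clock_t c0 = clock();
    cudaEventRecord(start);
    busywork<<<(n + 255) / 256, 256>>>(d, n);
    cudaEventRecord(stop);
    clock_t c1 = clock();                  // taken before the GPU has finished

    cudaEventSynchronize(stop);
    float gpu_ms = 0.0f;
    cudaEventElapsedTime(&gpu_ms, start, stop);

    printf("host-side time around the launch: %.3f ms\n",
           1000.0 * (double)(c1 - c0) / CLOCKS_PER_SEC);
    printf("time the GPU actually spent:      %.3f ms\n", gpu_ms);

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(d);
    return 0;
}

Which of those two numbers ends up in a project's stats depends entirely on what the application chooses to report.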
Regards,
Gundolf
Computers aren't everything in life. (Just a little joke.)
BeemerBiker wrote:If they got
Do you know why their applications were programmed in Fortran?
It's over a million lines of code, written by the scientists working on this problem, tried and tested over a couple of decades, so they could run it on a Cray supercomputer.
Rewriting the applications in another language would take many more years, if not decades, to say nothing of the man-hours involved and the money it would cost. Do you want to pick up that tab?
Besides, even if it could be done, you'd need a pretty hefty card with at least 750 MB of memory on it, as that's what some of those models take up in main memory. So again, not for everyone.
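Just to make the memory point concrete, here is a throwaway sketch (nothing to do with CPDN's actual Fortran code) that asks the CUDA runtime how much memory the card has free and compares it against that roughly 750 MB figure:

// memcheck.cu - illustrative sketch only, unrelated to climateprediction.net's code.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    size_t free_bytes = 0, total_bytes = 0;
    if (cudaMemGetInfo(&free_bytes, &total_bytes) != cudaSuccess) {
        printf("No usable CUDA device.\n");
        return 1;
    }
    // ~750 MB is the model-state figure mentioned above, not an official requirement.
    const size_t needed = 750ULL * 1024 * 1024;
    printf("GPU memory: %zu MB free of %zu MB total\n",
           free_bytes / (1024 * 1024), total_bytes / (1024 * 1024));
    printf(free_bytes >= needed ? "Enough room for a model of that size.\n"
                                : "Not enough free GPU memory for a model of that size.\n");
    return 0;
}

A 512 MB card, for instance, would fail that check straight away.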
RE: BeemerBiker wrote:If
I was joking, of course. However, it would be nice to check their computations much like is done here. There is a discussion about this very problem here. The problem with the humongous pile of Fortran code is that (among other problems) it is difficult to check their figures, i.e., to duplicate their work and verify it. That is what the scientific method is about: being able to duplicate the same result or experiment every time. Since the Fortran code is not being made available, one is reduced to reverse-engineering the conclusions or observing, for example, that temperature averages were consistently rounded up in the last digit, or that random numbers from a phone book can produce the same temperature expectations over the next decade using the Gore or hockey-stick climate model.