I noticed that after my screensaver has been launched a few times and subsequently deactivated so I can do work, the screensaver runs sluggishly afterwards and progress goes much more slowly. It acts as if the graphics card is running out of memory, so it has to go over the AGP bus and raid system DRAM to do its work, leaving the rest of the computer without any memory bandwidth. Could it be that new buffers are allocated on the graphics subsystem each time the graphics code is launched, but never deleted when the user stops it? The problem goes away when BOINC's scheduler shuts down Einstein@home to run another project like SETI@home. I am currently using an nVidia GeForce 2 Go with 32MB running at 1600x1200x32 bits per pixel, so if there really is a memory leak, it eats up my laptop's graphics memory quickly unless I set the screensaver preferences not to launch the BOINC screensaver.
Copyright © 2024 Einstein@Home. All rights reserved.
Graphics Code Memory Leak?
Well, you might try updating your drivers. I know from past experience that the older video cards are not powerful enough to do all the graphics processing for these projects, so they end up using system memory, and that does slow things down. I have a 6800 GT with 128 megs and I do not use graphics.
Link to Unofficial Wiki for BOINC, by Paul and Friends
Well, you might try updating your drivers. I know from past experience that the older video cards are not powerful enough to do all the graphics processing for these projects, so they end up using system memory, and that does slow things down. I have a 6800 GT with 128 megs and I do not use graphics.
I doubt that it has anything to do with the driver. SETI@home's graphics work fine every time the screensaver launches and quits, unless Einstein@home has launched its screensaver before and has not quit (like when you tell the preferences to leave suspended programs in the swap file). When Einstein@home quits, Windows cleans up the leaked memory and SETI@home works fine afterwards. If it were a problem with my driver or video subsystem, I would have trouble every time the screensaver launches Einstein@home's graphics code. On the first launch, Einstein@home's graphics code works fine. On subsequent launches without somehow forcing Einstein@home to quit (like when you tell the preferences to remove suspended programs from the swap file and it is time to swap the projects out, forcing the suspended one to quit; or when you reboot), my computer slows to a crawl. If it were the graphics driver's fault, I would have problems with both programs.
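To make the hypothesis concrete, here is a purely illustrative sketch of the lifecycle I am suspecting — this is not Einstein@home's actual code, just a simulation where each screensaver launch allocates a fresh set of full-screen buffers and the suspend path forgets to release them, so graphics memory held by the process grows with every launch/suspend cycle:

```python
# Hypothetical sketch of the suspected bug (NOT Einstein@home's real code):
# buffers are allocated on every launch, but suspend never frees them.

FRAME_BUFFER_BYTES = 1600 * 1200 * 4   # one 32-bpp full-screen buffer

vram = []                              # stands in for the card's memory pool

def launch_graphics():
    # front, back, and Z buffer allocated fresh on each launch
    vram.extend(bytearray(FRAME_BUFFER_BYTES) for _ in range(3))

def suspend_graphics_buggy():
    pass                               # bug: the buffers are never freed

def quit_graphics():
    vram.clear()                       # everything is reclaimed on process exit

for _ in range(4):                     # four launch/suspend cycles
    launch_graphics()
    suspend_graphics_buggy()

print(f"held after 4 cycles: {sum(len(b) for b in vram) / 2**20:.0f} MiB")
quit_graphics()
print(f"held after quit: {sum(len(b) for b in vram)} bytes")
```

After four cycles the simulated pool holds roughly 88 MiB — far beyond a 32MB card — which matches the observed behavior: fine on the first launch, crawling on later ones, and back to normal only once the process actually quits and the OS reclaims everything.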
Anyway, there is enough memory in a 32MB graphics subsystem to do this, even at 1600x1200x32 bits per pixel. If double buffering is used, around 22MB would be needed (don't forget the Z-buffer, which usually uses the same number of bits per pixel as the frame buffer). If triple buffering is used to improve the frame rate, around 29MB would be needed. Since almost no textures are used in either Einstein@home or SETI@home, my video card should be able to handle these graphics without raiding main system memory. This evidence points to a possible graphics memory leak. As for the driver, since I am using a GeForce 2 Go, I have to get it from Dell instead of using the reference driver from nVidia (which does not list mobile GPUs as supported), and I am using Dell's latest customized driver as far as I know.
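For anyone who wants to check the arithmetic behind those figures, here is the calculation spelled out (one full-screen buffer at 1600x1200 with 4 bytes per pixel, counted in MiB; double buffering means front + back + Z, triple means front + two backs + Z):

```python
# VRAM budget at 1600x1200, 32-bit color, 32-bit Z (figures from the post)
width, height = 1600, 1200
bytes_per_pixel = 4                                    # 32 bits per pixel
buffer_mib = width * height * bytes_per_pixel / 2**20  # one buffer, ~7.3 MiB

double_buffered = 3 * buffer_mib   # front + back + Z  -> ~22 MiB
triple_buffered = 4 * buffer_mib   # front + 2 backs + Z -> ~29 MiB

print(f"one buffer:      {buffer_mib:.1f} MiB")
print(f"double buffered: {double_buffered:.1f} MiB")
print(f"triple buffered: {triple_buffered:.1f} MiB")
```

Either way the working set fits inside the card's 32MB, which is why the numbers support a leak rather than a card that is simply too small for the job.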