I like to get the most out of my crunching machine, so I keep a record of statistics. Due to the recent credit drop, my previous stats have become useless, unless someone can tell me the relationship between the old and new credits.
Is there some (general) equation relating the old and new credit systems?
Thanks in advance,
Bert
Somnio ergo sum
How much credit drop?
4.24 uses 66% of the time that 4.02 used
4.24 receives 66% of the credit that 4.02 received
4.02 received 178.49 credit
4.24 receives 121.89 credit
So it still works out to about the same credit per hour
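If you want to sanity-check that against your own numbers, here's a rough back-of-the-envelope sketch in Python. The 10-hour 4.02 runtime is just a placeholder I made up; only the 66% runtime ratio and the credit figures from this post matter.

```python
# Rough check that credit per hour is about the same for 4.02 and 4.24.
# OLD_RUNTIME_H is a hypothetical placeholder; only the ratios matter.

OLD_RUNTIME_H = 10.0                   # assumed runtime of a 4.02 result, in hours
NEW_RUNTIME_H = 0.66 * OLD_RUNTIME_H   # 4.24 takes ~66% of the time

OLD_CREDIT = 178.49                    # credit per 4.02 result
NEW_CREDIT = 121.89                    # credit per 4.24 result

old_cph = OLD_CREDIT / OLD_RUNTIME_H
new_cph = NEW_CREDIT / NEW_RUNTIME_H

print(f"4.02: {old_cph:.2f} credits/hour")   # ~17.85
print(f"4.24: {new_cph:.2f} credits/hour")   # ~18.47, i.e. roughly the same
```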
98SE XP2500+ @ 2.1 GHz Boinc v5.8.8
RE: 4.24 uses 66% of the
4.17 on Linux uses 87% of the time that 4.0X used (21,500 vs. 24,500 on average on my Turion MT-40).
At 148 credits I was seeing a slight drop in credit per hour; now, at 122, the difference in credit per hour is evident. Not that I care much; the point is that you just can't assume the credit per hour hasn't changed.
I don't know of any general equation for adjusting your stats, since the performance improvement varies from one CPU to another. A quick calculation for my case is below.
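Here's the same kind of arithmetic for my Linux numbers. These are just my Turion's averages, not anything official, and I'm assuming the 21,500/24,500 figures are seconds per result.

```python
# Credit-per-hour change on Linux going from 4.0X to 4.17,
# using one host's average runtimes (assumed to be seconds per result).

OLD_RUNTIME_S, NEW_RUNTIME_S = 24_500, 21_500   # seconds per result
OLD_CREDIT,   NEW_CREDIT     = 148.0,  122.0    # credits per result

old_cph = OLD_CREDIT / (OLD_RUNTIME_S / 3600)
new_cph = NEW_CREDIT / (NEW_RUNTIME_S / 3600)

print(f"4.0X: {old_cph:.2f} cr/h, 4.17: {new_cph:.2f} cr/h")
print(f"change: {100 * (new_cph / old_cph - 1):+.1f}%")   # about -6%, hence the visible drop
```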
RE: RE: 4.24 uses 66% of
The Linux 4.0x application was already faster than the Windows application, and the newest application brings the two platforms closer together. Linux hosts saw a smaller drop, but they had been making 10-15% more than their Windows counterparts. Now that the playing field is more level, the crediting should be too, so like machines should now perform pretty close to each other.