taking forever!!!
-
Yes, as @HiDevin said, take the pic when the n/min are displayed. At 8000 n/min you should plot a TB in about 8 hours; if you're at 24 hours then you must be getting about 2600 n/min, assuming no file transfer bottleneck.
Running 3 of 4 cores reduces your speed by 25%, so run all 4 cores and set the plotter to "below normal" priority so you can still run other programs without freezing up.
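If you want to sanity-check those numbers yourself, here's a quick sketch of the arithmetic, assuming the standard Burst nonce size of 262,144 bytes (256 KiB) and a decimal terabyte:

```python
# Quick sanity check of the plot-time arithmetic above.
# Assumes the standard Burst nonce size of 262,144 bytes (256 KiB).
NONCE_BYTES = 262_144
TB = 10**12  # 1 TB (decimal, as drives are sold)

nonces_per_tb = TB / NONCE_BYTES  # ~3.81 million nonces

print(f"{nonces_per_tb:,.0f} nonces per TB")
print(f"at 8000 n/min: {nonces_per_tb / 8000 / 60:.1f} hours")      # ~7.9 h
print(f"24 h implies:  {nonces_per_tb / (24 * 60):,.0f} n/min")     # ~2650
```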
-
-
@rds said in taking forever!!!:
Yes, as @HiDevin said, take the pic when the n/min are displayed. At 8000 n/min you should plot a TB in about 8 hours; if you're at 24 hours then you must be getting about 2600 n/min, assuming no file transfer bottleneck.
Running 3 of 4 cores reduces your speed by 25%, so run all 4 cores and set the plotter to "below normal" priority so you can still run other programs without freezing up.
How do I set the priority?
-
@jhip626 ,
Go to Task Manager, then the Details tab, then right-click the plotter process and set its priority to "Below normal".
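If you'd rather not click through Task Manager every run, here's a sketch of doing the same thing from Python with the psutil library. The process name "XPlotter_avx.exe" is just an assumption for illustration; check Task Manager for what your plotter is actually called:

```python
# Sketch: set the plotter to below-normal priority via psutil (Windows).
# "XPlotter_avx.exe" is an assumed process name -- adjust to match yours.
import psutil

for proc in psutil.process_iter(['name']):
    if proc.info['name'] == 'XPlotter_avx.exe':
        proc.nice(psutil.BELOW_NORMAL_PRIORITY_CLASS)
        print(f"Set PID {proc.pid} to below-normal priority")
```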
-
@rds said in taking forever!!!:
@jhip626 ,
Go to Task Manager, then the Details tab, then right-click the plotter process and set its priority to "Below normal".
Cool, thanks! Now I know that for next time...
-
@jhip626 said in taking forever!!!:
A10 7850K
With that processor it seems pretty normal - if you have a discrete GPU you could use the GPU plotter to speed things up a bit.
-
Don't worry, my plotting is 3000 nonces/min too; I don't feel it takes that long though :P
-
@vier23 said in taking forever!!!:
@jhip626 said in taking forever!!!:
A10 7850K
With that processor it seems pretty normal - if you have a discrete GPU you could use the GPU plotter to speed things up a bit.
No, the GPU is in the processor, but it's something like 8 cores go to GPU processing and 4 go to the CPU... so maybe I can try GPU plotting with it and see if it works.
-
@jhip626 I'm not sure if GPU plotting works with APUs; I think you need a discrete graphics card. Someone else with better knowledge regarding this should chime in, though.
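One quick way to check is whether the APU's graphics show up as an OpenCL device at all, since the GPU plotter relies on OpenCL. A minimal sketch using the pyopencl package (assuming you have it and an OpenCL runtime installed):

```python
# Sketch: list OpenCL platforms/devices to see if the APU's GPU is visible.
# If no GPU-typed device shows up here, GPU plotting likely won't work.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        kind = cl.device_type.to_string(device.type)
        print(f"{platform.name}: {device.name} ({kind})")
```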
-
@vier23 said in taking forever!!!:
@jhip626 I'm not sure if GPU plotting works with APUs; I think you need a discrete graphics card. Someone else with better knowledge regarding this should chime in, though.
I have two machines: one has an integrated GPU and one has a discrete one. I chose to mine with the GPU on both, but plot CPU-style with XPlotter. My nonce rates are 3000 n/min on a $350 laptop and 8000 n/min on my bigger AMD machine.

