What am I doing wrong? GPU Plotting
-
@kohai & @jervis Thank you very much for the help! I think the new settings and some patience will help. I was thinking right away I would see thousands/minute flying by me. I will let you know what happens.
-
Still not working right. It plotted for hours and after about 12 hours was still only doing around 1,200 nonces/minute. It had only produced an 88 GB file after all that time. Any other ideas? Not sure what else to try.
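For scale, a Burst nonce is 256 KiB on disk, so a plotting rate converts to gigabytes written like this (a quick sketch; the rate and duration are just the figures from this post):

```python
# A Burst nonce is 256 KiB (4096 scoops x 64 bytes), so a plotting
# rate in nonces/min converts to disk throughput as follows.

NONCE_BYTES = 256 * 1024  # 262,144 bytes per nonce

def gb_plotted(nonces_per_min: float, hours: float) -> float:
    """Approximate gigabytes written at a given plotting rate."""
    nonces = nonces_per_min * 60 * hours
    return nonces * NONCE_BYTES / 1e9

# 1,200 nonces/min sustained for 12 hours:
print(round(gb_plotted(1200, 12)))  # -> 226 (GB)
```

Which also shows why a lone on-screen rate can mislead: 88 GB in 12 hours works out to well under 500 nonces/min on average.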
-
Just let it be, man. Remember, it will take more than 12 hours to plot that into the buffer, and then a whole day to optimize.
-
@jervis Ok but I am doing direct, is that still the case?
-
I have switched back to CPU plotting (old reliable). This is starting to cost me BURSTCOIN :) I do appreciate the help. I bought the card to do some simultaneous ZCASH mining and to get faster plotting, which is working great. Earn some ZCASH to buy BURSTCOIN! Unfortunately, I find the GPU plotting process very clunky. It definitely does not go as smoothly as the videos I saw. The estimates it shows are wildly off and take too long to settle on something reasonable, so you don't know until after many hours or days whether it is working. CPU plotting at least is straightforward and works; it just takes a long time. I will stick with that for now.
-
@marrada I did the same. CPU plotting is slow and steady but it gets there every time.
-
@marrada you can try devices.txt as
0 0 7168 128 4096
or even
0 0 7168 256 4096
7168 equals 7 GB of memory - I would advise not using the full amount (first, most manufacturers count GB in a simplified manner, and second, you still want to be able to use your PC ;) )
-
@marrada For your devices.txt I'd go with
0 0 8192 64 8192
I started GPU plotting 2 * 6TB drives yesterday morning - it took almost 24 hours for the initial build of the files, but now the files are being filled in at approx. 33,000 nonces/min. In your first screenshot, you've completed the initial build and are now filling in - that 960 figure will continue rising right up until you plot the last nonce.
@LithStud, actually 8192 is 2 GB of RAM - the number represents the number of scoops to plot, so with a scoop being 256K, 4096 is 1 GB, not 4 GB.
-
@haitch ahh, didn't know that :) For some reason I always thought that one was kind of letting you limit the RAM used on the video card :) But then again, it worked for me even though I was calculating it wrongly :)
-
@haitch @LithStud @Propagandalf @luxe @gpedro Okay, I have finished plotting using the GPU, but then this is what happens when I mine with it:
[ERROR 1007] The deadline for your nonce is REALLY BAD: 144922 years, 10 months, 13 days, 5 hours, 3 mins, 8 secs - wrong block? are your plot files corrupted?
What did I do wrong? This is the first GPU plot I've made. I used direct, and it took 3 days to optimize after plotting. Does that really happen? Direct GPU plotting automatically optimizes the drive, right?
C:\>PlotsChecker.exe d:\
file: 14946811813230596345_900000000_22700032_22700032 need to delete and replot
Guys, I need help here. Please do respond when you're able. Thank you. :)
-
@jervis As far as I know, if you do direct plotting you don't need to optimize. So it's either that, or it's because 900000000 isn't evenly divisible by 22700032, and that's not good.
I might be wrong, as I haven't plotted a big drive yet nor used direct plotting :)
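For reference, a sketch of how to read that filename, assuming the common PoC1 plot naming convention accountId_startNonce_nonces_stagger (the `check_plot` helper is hypothetical, not part of PlotsChecker):

```python
# PoC1 plot filenames encode accountId_startNonce_nonces_stagger.
# A file is "optimized" when stagger == nonces; a plotter normally
# also keeps the nonce count a multiple of the stagger.

def check_plot(name: str) -> dict:
    """Parse a PoC1 plot filename and run two basic layout checks."""
    account, start, nonces, stagger = (int(x) for x in name.split("_"))
    return {
        "optimized": nonces == stagger,
        "stagger_divides_nonces": nonces % stagger == 0,
    }

print(check_plot("14946811813230596345_900000000_22700032_22700032"))
# -> {'optimized': True, 'stagger_divides_nonces': True}
```

Note that for jervis's file the stagger equals the nonce count, i.e. the filename itself describes an optimized plot - consistent with direct plotting not needing a separate optimize pass.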
-
@LithStud yeah, blago whooped it into my head in alttech chat. lol! I'm good now. Thanks man.



