GPU plotter Help
-
@haitch said:
@Redmogul You have to create your own. First run "gpuplotgenerator setup" and configure it for your card. Then start plotting with a command similar to this (adjust for your system):
gpuPlotGenerator.exe generate direct u:\4401562696129194441_106430464_7618560_16384
4401562696129194441 - replace with your numeric account ID
106430464 - nonce to start at. If this is your first plot use 0
7618560 - number of nonces to plot, must be a multiple of the stagger - the next field
16384 - stagger. Each 1 = 256 KB of memory, so divide the stagger by 4 to get the amount of memory used in MB. In this case it's 4 GB (4096 MB).
@haitch In this example, how many GB are you mining? I'm still confused about nonces and stagger. Can you please guide me? I will try to plot 920 GB on an XFX R9 280X GPU.
account ID: xxxxxxxxxxxxx
nonce to start: 715160321
number of nonces to plot: ?
stagger: ? (a rough calculation is sketched at the end of this post)
My computer specs:
Intel Core i3
Windows 10 Pro 64-bit
8 GB RAM
GPU: R9 280X specs: 2048 Stream Processors, 850 MHz Core Clock, 3 GB 384-bit GDDR5
6000 MHz Effective Memory Clock
Thanks in advance
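A rough calculation sketch for the 920 GB target above, assuming 1 nonce = 256 KiB and using haitch's stagger/4 rule; the 16384 stagger is just one choice that keeps the buffer at 4 GB, half of the 8 GB listed above:

@echo off
rem Rough arithmetic sketch (assumption: 1 nonce = 256 KiB, so 1 GB of plot space = 4096 nonces)
set /a NONCES=920*4096
rem 920 GB -> 3768320 nonces
set /a STAGGER=16384
rem stagger 16384 -> 16384/4 = 4096 MB of RAM for the plot buffer
set /a REMAINDER=NONCES %% STAGGER
rem (%% is how the modulo operator is written inside a .bat file)
echo nonces=%NONCES% stagger=%STAGGER% remainder=%REMAINDER%
rem remainder prints 0 here (3768320 = 230 * 16384), so the nonce count is a valid multiple of the stagger
pause

With those numbers and the start nonce quoted above, the plot file name would come out as something like xxxxxxxxxxxxx_715160321_3768320_16384.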
-
Can anyone give me some direction on how to get gpuPlotGenerator to find my GPU? It seems to only find my CPU, I think. Also, do the GPU plots look any different than the CPU plots?
-
@Redmogul https://github.com/bhamon/gpuPlotGenerator/blob/master/README.md
[platformId] [deviceId] [globalWorkSize] [localWorkSize] [hashesNumber]
platformId: The platform id of the device. Can be retrieved with the [listPlatforms] command.
deviceId: The device id. Can be retrieved with the [listDevices] command.
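For anyone else stuck here, a minimal sketch of the sequence (I believe listDevices takes the platform id as an argument; check the README if the syntax differs). If only a CPU platform ever shows up, it usually means the GPU's OpenCL driver isn't installed:

gpuPlotGenerator.exe listPlatforms
gpuPlotGenerator.exe listDevices 0
rem devices.txt then gets one line per device in the format quoted above,
rem e.g. the values that appear later in this thread:
rem 0 0 2000 2000 8192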
-
@Blago
I appreciate the info, but when I run the listPlatforms command it still only shows me my CPU.
-
Show your 'junk' when asking for help; others will be more likely to help. Ubuntu, Windows 7, 8, or 10? What type of video card? How much memory on your video card? What type of CPU? How much memory on your motherboard?
-
@TurtleWinsRace OK Thank You for your suggestion. Here's the system info I'm working with.
Windows 7 Home Premium 64
Processor: AMD FX(tm)-8350 Eight-Core Processor
Installed memory (RAM): 16.0 GB
Graphics Card: AMD Radeon R9 270X 4GB
-
@Redmogul said:
it still only shows me my CPU
Could you please be a bit more specific? How many platforms does it tell you that you have? Does it show like this?
-
@Propagandalf I really appreciate your help!!
I finally figured out that part; I was making it much more difficult than I should have <-- new to this.
I am stuck on one last part, which is how to figure out the stagger.
-
@Redmogul No problem, I'm glad to help! Have you already saved a configuration file?
The stagger is the value that decides how much RAM to use. The number of nonces you wish to plot must be a multiple of your stagger, otherwise it won't work. I do not know the formula for calculating the stagger myself, but I believe haitch does.
@haitch Redmogul has 16 GB RAM and 4 GB VRAM. How do you take RAM vs. VRAM into consideration when calculating the stagger, and what is the formula? If you decide to plot several files at once, should the stagger be changed for each file?
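In the meantime, here is a sketch of the rule of thumb from haitch's post near the top, with the caveat that this is my reading and not an official formula: the stagger only budgets system RAM at stagger/4 MB per file, the VRAM side is handled separately through the devices file, and each file's nonce count must be an exact multiple of that file's stagger.

@echo off
rem Sketch, not an official formula: max stagger = RAM budget in MB * 4
rem The 8192 MB budget is just an example for a 16 GB machine; adjust to taste.
set /a RAM_BUDGET_MB=8192
set /a MAX_STAGGER=RAM_BUDGET_MB*4
echo A budget of %RAM_BUDGET_MB% MB of RAM allows a stagger of up to %MAX_STAGGER%
rem Plotting several files at once? Split the budget (e.g. RAM_BUDGET_MB/2 each)
rem and work out each file's stagger from its own share.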
-
@Propagandalf ok cool
so this is what I have
gpuPlotGenerator generate buffer E://11928632418087677223_0_1860000_2000
pause
and tried to run it but got [ERROR] Invalid parameters count at line [1]
-
@Propagandalf
and used this as the devices file:
0 0 2000 2000 8192
-
@Redmogul said:
generate buffer E://11928632418087677223_0_1860000_2000
Before the rest of your file parameters, I believe the drive path should be written with only one forward slash after the drive letter (E:/), not a double slash (E://). I also think that your stagger number is wrong, but I would prefer if someone else gave you some good intel on that.
@Redmogul said:
0 0 2000 2000 8192
Are you certain that you have selected the correct value for device platform and device ID? I'm not saying you haven't, but do check. That last value, 8192, can be reduced if you experience display driver crashes when trying to plot. I had to reduce mine all the way down to 900 before it worked.
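Putting both suggestions together, something like this is what I would try, keeping Redmogul's own numbers (the 900 is only the value that happened to work on my card, and devices.txt is the file name I remember the generator reading):

The .bat file:
gpuPlotGenerator generate buffer E:/11928632418087677223_0_1860000_2000
pause

devices.txt:
0 0 2000 2000 900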
-
@Propagandalf wow
I was following the video and the readme file for GPU plotting, and that's the way they show it. I'll give it a try.
-
@Propagandalf same error message
-
@Redmogul I still don't know the exact answer, but wish I could help you! Have you figured it out yet?
-
@Propagandalf Thanks for the help. It was only one / and that seemed to fix it. Now I have to figure out why I can't get the miner to work... lol
-
@Redmogul Glad to hear you figured out the plotting! Perhaps you could start a new thread with your mining issues if you need help. Remember to include as many details as possible, so that people can help more easily.
May the burst be with you!
-
@Redmogul
Hey, when I started this whole thing I was downloading every miner, plotter, and wallet I could find, and found myself pulling my hair out trying to write bat files, do command lines, and just plain figure this stuff out. I found that the AIO (All In One) client:
https://forums.burst-team.us/topic/16/hotfix-burst-client-for-windows-v0-3-1-all-in-one-wallet-plotting-mining
is your best bet. daWallet seems to be constantly working on this project, and it seems to me this is the direction the whole project is going in order to make it turnkey for new users. Follow the video, it is step by step, and you will be mining in 30 minutes (after you have plotted).
Optional:
If you want to get started fast with the local wallet, I recommend you download the latest package of the blockchain here: http://db.burst-team.us. You have to extract it to your C:\Users\Username\AppData\Roaming\BurstWallet\burst_db\ folder.
(The AppData folder is hidden, so you are going to have to change the Explorer settings to show hidden files/folders.) Manually place it (the blockchain, DB.zip) there as the forum post above describes; it takes forever to download the blockchain through the client.
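A quick way to jump straight to that hidden folder from a command prompt (assuming the wallet has been run at least once so the folder already exists):

rem %APPDATA% expands to C:\Users\<you>\AppData\Roaming even though the folder is hidden
start "" "%APPDATA%\BurstWallet\burst_db"
rem extract the downloaded DB.zip into the folder that opens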
Again, I can't emphasize enough: just follow the video and do everything through the client (the bottom-left buttons, Plotting and Mining). After plotting there are 3 settings for mining, and I believe the third one down is for GPU mining. I honestly did not see a difference in the speed or the number of Burst deadlines I was finding while doing GPU mining, and my video card was getting hella hot. I live in the desert and it's hot enough as is without adding to the climate and burning up my video cards :) I hope this helps. Have a good day.
Kind regards,
~LostBoy_SeekTime
-
@LostBoy
Good info! I was in the same boat as you when I started, and I did the same thing, hehe.
@Redmogul
I just want to add that you don't have to use a GPU miner if your plots were plotted with a GPU plotter; you could also use a CPU miner.
@LostBoy said:
my Video card was getting hella hot
Unless you have an insane amount of TB, I cannot imagine why your video card would get hot during mining, unless you are using your GPU for mining other cryptocurrencies at the same time. After all, Burst is POC and not POW.
-
@Propagandalf
Yeah, I was doing Litecoin too, but I just turned that off because I need to pack that machine for the move to the Midwest. :) My Burst machine will stay running until I light the torch and walk out the door. :)
Regards



