Xplotter like software for linux
-
Hi,
I currently use Xplotter from the wallet, which generates optimized plots by default, so there's no need to optimize them afterwards.
I'm planning to deploy a Linux system, but I didn't find much info about generating optimized plots with something like mdcct.
I'm aware that optimized plots can be generated by setting the stagger size equal to the nonce count. I tested this with gpuplotter, which didn't work because it would have required immense amounts of RAM to plot 8 TB.
I hope there is a way to plot a single (optimized) 8 TB plot on Linux without needing something like 1 TB of RAM? Cheers
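(For context, some back-of-the-envelope math on my part rather than anything from the docs: each nonce is 4096 scoops x 64 bytes = 256 KiB, and the plotters buffer roughly stagger x 256 KiB in RAM, so a stagger equal to the full nonce count of an 8 TB plot means buffering on the order of the whole plot, which is presumably why gpuplotter blew up.)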
-
use: https://github.com/Mirkic7/mdcct
It generates pre-optimized plots, unlike the GPU plotters. Also, you don't need an insane amount of RAM to plot, but the more you can buffer the better.
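Roughly like this (assuming the repo still builds with a plain make; check its README if it ships a compile script instead):
  git clone https://github.com/Mirkic7/mdcct
  cd mdcct
  make
  # e.g. 100000 nonces starting at nonce 0, 8192-nonce stagger (~2 GB buffer), 4 threads
  ./plot -k <numeric account id> -d /path/to/plots -s 0 -n 100000 -m 8192 -t 4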
-
Just to confirm: the plot command would look something like this?
./plot -k KEY -x 0 -d /some/path/to/plots -s 0 -n 12345678 -m 12345678 -t 1 -a
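For reference, my reading of the flags (based on the dcct docs these forks inherit, so correct me if any are off):
  -k KEY        numeric account ID to plot for
  -x 0          core selection (0 should be the generic/default core)
  -d <path>     target directory for the plot file
  -s 0          start nonce
  -n 12345678   number of nonces to generate
  -m 12345678   stagger size, i.e. how many nonces are buffered in RAM per pass
  -t 1          number of threads
  -a            asynchronous writing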
This would generate an optimized plot with 12345678 nonces, so only a single disk head move is required per round, and it would also work with a standard 4-8 GB of RAM.
Is that correct?
-
Yup, you can ignore the -m and -a if you want, and also -k I believe, but that's the general sequence.
-
Omitting -m would result in m = n by default?
-k is required I believe, it's the account ID if I'm correct.
-
Ah that's right, -m is basically the amount of memory you want to use. Sorry, I haven't used the plotter in a while, but if you check the original repo for the plotter a better readme is available for it. The flags are the same, so just use the new plotter with the old docs:
https://github.com/BurstTools/BurstSoftware/blob/master/README.md
-
[stagger size] = set it to 2x the amount of MB RAM your system has (with async 1x the RAM your system has)
That suggests it's not possible to generate optimized plots? The stagger size needs to cover the whole 8 TB for optimized plots on an 8 TB disk.
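To put numbers on it (my own math, please correct me): each nonce is 256 KiB, so with e.g. 8 GB of RAM that formula gives a stagger of about 16384 nonces (a ~4 GB buffer), while a fully optimized 8 TB plot needs a stagger equal to its full ~30 million nonces.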
-
Yeah, the *nix support isn't the best, but I wouldn't worry too much about the optimising; it does help read speeds but shouldn't be a bottleneck.
-
@felixbrucker said in Xplotter like software for linux:
[stagger size] = set it to 2x the amount of MB RAM your system has (with async 1x the RAM your system has)
That suggests it's not possible to generate optimized plots? The stagger size needs to cover the whole 8 TB for optimized plots on an 8 TB disk.
There are two ways:
either you throw away a lot of computation power by generating the data stream for an "optimized" file,
or
you pre-allocate the file space (with all zeroes), then compute all nonces and write them to disk into the pre-allocated file, jumping all over the place (which obviously doesn't work on compressed filesystems).
The algorithm computes all nonces with a stagger of ONE. All plotting software out there "optimizes" that, giving you a parameter for RAM usage (stagger).
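If it helps, my understanding of the fully optimized layout (a single stagger group spanning the whole file) is that each 64-byte scoop has a fixed target offset:
  offset(nonce i, scoop s) = (s * total_nonces + i) * 64 bytes
so the second approach can compute nonces in any order and write every scoop straight to its final position in the pre-allocated file.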
-
@manfromafar said in Xplotter like software for linux:
Yeah, the *nix support isn't the best, but I wouldn't worry too much about the optimising; it does help read speeds but shouldn't be a bottleneck.
It is when using large disks; I want to be under a 60-second read time.
@vaxman said in Xplotter like software for linux:
@felixbrucker said in Xplotter like software for linux:
[stagger size] = set it to 2x the amount of MB RAM your system has (with async 1x the RAM your system has)
That suggests it's not possible to generate optimized plots? The stagger size needs to cover the whole 8 TB for optimized plots on an 8 TB disk.
There are two ways:
either you throw away a lot of computation power by generating the data stream for an "optimized" file,
or
you pre-allocate the file space (with all zeroes), then compute all nonces and write them to disk into the pre-allocated file, jumping all over the place (which obviously doesn't work on compressed filesystems).
The algorithm computes all nonces with a stagger of ONE. All plotting software out there "optimizes" that, giving you a parameter for RAM usage (stagger).
Well, I don't know how Blago's Xplotter does it, but I'm able to plot optimized 8 TB plots with only 500 MB of RAM (stagger size equivalent to 8 TB); it seems I can't do that on Linux?
-
@felixbrucker
You can do it with Linux, but indirectly.
Plot whatever you can onto a temporary target, then optimize from there to the disk that will be mined. The temp can be much smaller, and it therefore defines your maximum file size.
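Something along these lines (the optimizer invocation below is just a placeholder, check the README of whichever optimizer you use for the real arguments):
  # 1. plot onto the temp disk with whatever stagger your RAM allows
  ./plot -k <account id> -d /mnt/temp -s 0 -n <nonces that fit the temp disk> -m <stagger> -t 4
  # 2. rewrite the unoptimized file onto the 8 TB mining disk in optimized layout
  ./optimize <unoptimized plot file> /mnt/8tb/   # placeholder, exact tool and flags vary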
-
@felixbrucker
https://forums.burst-team.us/topic/5902/linux-plotting-mining/32
https://github.com/k06a/mjminer/tree/fix/optimize
-
@Blago thanks, this seems to be the only solution (and it only popped up yesterday)?
So it seems my assumption was correct until yesterday :D
-
@vaxman said in Xplotter like software for linux:
@felixbrucker
You can do it with Linux, but indirectly.
Plot whatever you can onto a temporary target, then optimize from there to the disk that will be mined. The temp can be much smaller, and it therefore defines your maximum file size.
The problem with this approach is that I don't have a large enough temp space, and I want a single plot so the disk head only has to move once per round.

