A larger stagger should read faster, if only because there are fewer seek operations on the disk. Ideally you'd run an optimizer program (the dcct tools, etc.) on the plot to do that for you, but you'll need enough free space for both the original and the optimized plot.
If you ask for too much RAM in the plotting or optimization process, the computer can be forced to use swap space instead, which slows things down a LOT.
Just something to keep in mind.
My process if I had one 3TB drive and a random amount of free space on another disk:
- Plot to random space as much as you can fit.
- Optimize the plot to the 3TB drive.
- Repeat until the 3TB drive is full.
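The loop above can be sketched as a toy simulation. The numbers and the plot/optimize steps here are stand-ins, not the real dcct tools commands (which take their own arguments); the point is just how scratch space caps each pass and determines the final file count.

```python
# Toy simulation of the plot-then-optimize loop described above.
# "plot" and "optimize" are placeholders for your actual tools.
def fill_drive(target_gb, scratch_gb):
    """Return the sizes of the optimized files that end up on the
    target drive when each pass is capped by scratch free space."""
    files = []
    remaining = target_gb
    while remaining > 0:
        chunk = min(scratch_gb, remaining)  # plot what fits in scratch
        # 1) plot `chunk` GB of nonces to the scratch drive
        # 2) optimize that plot onto the target drive
        files.append(chunk)
        remaining -= chunk                  # 3) delete the scratch copy, repeat
    return files

print(fill_drive(3000, 500))  # six 500GB optimized files on a 3TB target
```

With 100GB of scratch you'd end up with 30 files; with 1TB, just 3.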
If your scratch drive only has 100GB free, that's 30 files on the 3TB drive, which might not read any better than leaving it unoptimized. But if you have 500GB or 1TB free elsewhere, you'd be down to 6 or 3 files, and that should be pretty speedy!
Ideally you'd want one fully optimized file per disk, where the stagger size equals the number of nonces, but you do what you can with what you have.