When the grain’s Size would cause your Time setting to reach out-of-bounds buffer positions, Beads seems to clamp the Time setting so that it always respects the Size setting. This makes sense, since Beads is not designed to wrap around to the other end of the buffer (avoiding the inherent discontinuities).
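To make sure I’m describing the behavior I think I’m seeing, here is a tiny C++ sketch of it. The function name, the normalized 0..1 units, and the logic are all my own guesses for illustration, not anything from the actual firmware:

```cpp
#include <algorithm>

// Current behavior as I understand it: the grain start position (Time) is
// pulled back so that a grain of the requested Size always fits inside the
// buffer. Time and Size are assumed to be normalized to 0..1 here.
float ClampTimeToSize(float time, float size) {
  return std::min(time, 1.0f - size);
}
```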
However, some of my applications rely on grabbing grains from very particular regions of the buffer. I would like the leeway not to have to fine-tune Size just to ensure I don’t end up with an “inaccurate” Time setting (relative to my pre-calculated Time CV).
Would it be possible to reverse this scheme, so that Size yields to Time as needed? We are not yet privy to Beads’ firmware, so I don’t know how deep a change this would represent.
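In code terms, the swap I’m imagining would look something like this (same caveats as above: invented names and units, purely illustrative, not real firmware code):

```cpp
#include <algorithm>

// Requested behavior: keep the start position (Time) exactly where the CV
// puts it, and instead shorten the grain (Size) whenever it would otherwise
// run past the end of the buffer.
float ClampSizeToTime(float time, float size) {
  return std::min(size, 1.0f - time);
}
```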