Beads prioritizes Size over Time

When the grain’s Size would cause your Time setting to access out-of-bounds buffer positions, Beads appears to clamp the Time setting so that it always respects the Size setting. This makes sense, because Beads is not designed to wrap back to the other end of the buffer (which avoids the inherent discontinuity at the seam).
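To illustrate what I mean, here is a rough C++ sketch contrasting wrapping with clamping. The firmware isn’t public, so the names and the exact clamping rule are my guesses:

```cpp
#include <algorithm>
#include <cstdint>

// Wrapping (what Beads avoids): the read position folds back to the
// other end of the buffer, creating a discontinuity at the seam.
uint32_t WrapPosition(uint32_t position, uint32_t buffer_size) {
  return position % buffer_size;
}

// Clamping (what Beads appears to do): Time is limited so that a grain
// of `grain_size` samples starting at `position` stays inside the
// buffer. Assumes grain_size <= buffer_size.
uint32_t ClampPosition(uint32_t position, uint32_t grain_size,
                       uint32_t buffer_size) {
  return std::min(position, buffer_size - grain_size);
}
```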

However, some of my applications rely on grabbing grains from very particular regions of the buffer. I would like the leeway not to have to fine-tune Size to ensure I don’t get an “inaccurate” Time setting (with respect to my pre-calculated Time CV).

Would it be possible to reverse this scheme, so that Size yields to Time as needed? We are not yet privy to Beads’ firmware, so I don’t know how deep a change this would be.

There are two lines in the code controlling this behavior (one increases the time to avoid bumping into the “head” of the buffer, the other decreases it to avoid bumping into the “tail”).

They can be commented out, in which case the grain will quickly fade out when approaching the head or tail.
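Based on that description, the logic might look roughly like this. This is a simplified sketch with hypothetical names, not the actual firmware identifiers, and the fade shape is an assumption:

```cpp
#include <algorithm>

// The two lines in question: commenting them out would let Time win
// over Size, at the cost of grains running into the buffer boundaries.
float ConstrainTime(float time, float grain_duration,
                    float buffer_duration) {
  time = std::max(time, grain_duration);                    // avoid the head
  time = std::min(time, buffer_duration - grain_duration);  // avoid the tail
  return time;
}

// Without the clamps, a grain approaching a boundary is quickly faded
// out instead: gain ramps to zero over the last `fade_duration` of
// remaining headroom (hypothetical linear fade).
float BoundaryGain(float time, float grain_duration,
                   float buffer_duration, float fade_duration) {
  float headroom = std::min(time - grain_duration,
                            buffer_duration - grain_duration - time);
  return std::clamp(headroom / fade_duration, 0.0f, 1.0f);
}
```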

I plan to publish the code in 2 or 3 months (when the next batch is sent to dealers).


Excellent! There’s no rush on my end; I’ve just been developing a “Beads automator” Teletype script that acts as a tempo-synced sample-acquisition and playback scrubber. I should probably use the next 2-3 months to explore the sample-slicing delay mode (who knows why I’m trying to replicate it myself…).

P.S. - Happy cake day